Project Description - Detecting pneumonia amounts to detecting inflammation of the lungs. The goal of this project is to build a pneumonia detection system that locates the position of inflammation in a chest X-ray (CXR) image, automating pneumonia screening in chest radiographs and reporting the affected area through a bounding box. Such a system can assist physicians in making better clinical decisions, or even replace human judgement in certain functional areas of healthcare (e.g., radiology). Guided by relevant clinical questions, powerful AI techniques can unlock clinically relevant information hidden in massive amounts of data, which in turn can support clinical decision making.
Objective of Project - The objective of this project is to build an algorithm that locates the position of inflammation in a medical image, i.e., automatically locates lung opacities on chest radiographs. Concretely, the objectives are to: • Build an object detection model • Use transfer learning to fine-tune a model • Set the optimizers, loss functions, epochs, learning rate, batch size, checkpointing, early stopping, etc.
# import required libraries
import os
from PIL import Image
from array import array
import cv2
from glob import glob
from matplotlib import pyplot as plt
%matplotlib inline
import numpy as np
import pandas as pd
import seaborn as sns
import warnings
warnings.filterwarnings('ignore')
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D
from tensorflow.keras.layers import MaxPool2D
from tensorflow.keras.layers import BatchNormalization
from tensorflow.keras import optimizers
from tensorflow.keras.optimizers import Adam
from tensorflow.keras import regularizers
from tensorflow.keras import layers
from matplotlib.patches import Rectangle
import gc
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Concatenate, Dense, Dropout, Flatten, Activation
from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
# Load detailed class info dataset
detail_class_info_data = pd.read_csv('/content/drive/MyDrive/Raw Data/stage_2_detailed_class_info.csv')
# Load train dataset
train_labels_data = pd.read_csv('/content/drive/MyDrive/Raw Data/stage_2_train_labels.csv')
#Define Path for Images
train_path = '/content/drive/MyDrive/Raw Data/stage_2_train_images/'
test_path = '/content/drive/MyDrive/Raw Data/stage_2_test_images/'
# This data is in CSV format, containing the unique patient ID and target class, i.e., No Lung Opacity / Not Normal, Normal, or Lung Opacity.
detail_class_info_data.head(10)
| | patientId | class |
|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | No Lung Opacity / Not Normal |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | No Lung Opacity / Not Normal |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity |
# This data is in CSV format containing the unique patient ID and target class.
# The CSV file contains the patient ID, bounding box details with (x, y) coordinates plus the width and height of that box, and the target variable. For target 0 the bounding box values are NaN, meaning there is no pneumonia in the image, whereas target 1 means there are symptoms of pneumonia.
# Looking closely, there are duplicate patient ID entries in the CSV file. Rows #4 and #5 and rows #8 and #9 share the same patient ID, which means the patient shows pneumonia in multiple areas of the lungs.
train_labels_data.head(10)
| | patientId | x | y | width | height | Target |
|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | NaN | NaN | NaN | NaN | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | NaN | NaN | NaN | NaN | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | NaN | NaN | NaN | NaN | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | NaN | NaN | NaN | NaN | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | 1 |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.0 | 152.0 | 256.0 | 453.0 | 1 |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | NaN | NaN | NaN | NaN | 0 |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | NaN | NaN | NaN | NaN | 0 |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.0 | 577.0 | 160.0 | 104.0 | 1 |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.0 | 575.0 | 162.0 | 137.0 | 1 |
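The duplicate-ID pattern described above can be checked directly with `value_counts`. Below is a toy sketch on a hypothetical three-row table (not the real labels file), where one patient has two opacity regions and therefore two rows:

```python
import pandas as pd

# Toy stand-in for the labels table: patient 'p2' has two opacity regions,
# so it appears twice, once per bounding box.
labels = pd.DataFrame({
    'patientId': ['p1', 'p2', 'p2'],
    'x': [None, 264.0, 562.0],
    'Target': [0, 1, 1],
})
boxes_per_patient = labels['patientId'].value_counts()
print(boxes_per_patient.to_dict())  # {'p2': 2, 'p1': 1}
```

On the real file, the same `value_counts` call gives the number of bounding boxes per patient.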
# Concatenate both datasets column-wise (rows are aligned by index)
data = pd.concat([train_labels_data,detail_class_info_data["class"]],axis=1,sort=False)
data = data.drop_duplicates()
data.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 30227 entries, 0 to 30226
Data columns (total 7 columns):
 #   Column     Non-Null Count  Dtype
---  ------     --------------  -----
 0   patientId  30227 non-null  object
 1   x          9555 non-null   float64
 2   y          9555 non-null   float64
 3   width      9555 non-null   float64
 4   height     9555 non-null   float64
 5   Target     30227 non-null  int64
 6   class      30227 non-null  object
dtypes: float64(4), int64(1), object(2)
memory usage: 1.8+ MB
# Checking null values
data.isnull().sum()
patientId        0
x            20672
y            20672
width        20672
height       20672
Target           0
class            0
dtype: int64
data.head()
| | patientId | x | y | width | height | Target | class |
|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | NaN | NaN | NaN | NaN | 0 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | NaN | NaN | NaN | NaN | 0 | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | NaN | NaN | NaN | NaN | 0 | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | NaN | NaN | NaN | NaN | 0 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | 1 | Lung Opacity |
data.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 30227 entries, 0 to 30226
Data columns (total 7 columns):
 #   Column     Non-Null Count  Dtype
---  ------     --------------  -----
 0   patientId  30227 non-null  object
 1   x          9555 non-null   float64
 2   y          9555 non-null   float64
 3   width      9555 non-null   float64
 4   height     9555 non-null   float64
 5   Target     30227 non-null  int64
 6   class      30227 non-null  object
dtypes: float64(4), int64(1), object(2)
memory usage: 1.8+ MB
data.describe()
| | x | y | width | height | Target |
|---|---|---|---|---|---|
| count | 9555.000000 | 9555.000000 | 9555.000000 | 9555.000000 | 30227.000000 |
| mean | 394.047724 | 366.839560 | 218.471376 | 329.269702 | 0.316108 |
| std | 204.574172 | 148.940488 | 59.289475 | 157.750755 | 0.464963 |
| min | 2.000000 | 2.000000 | 40.000000 | 45.000000 | 0.000000 |
| 25% | 207.000000 | 249.000000 | 177.000000 | 203.000000 | 0.000000 |
| 50% | 324.000000 | 365.000000 | 217.000000 | 298.000000 | 0.000000 |
| 75% | 594.000000 | 478.500000 | 259.000000 | 438.000000 | 1.000000 |
| max | 835.000000 | 881.000000 | 528.000000 | 942.000000 | 1.000000 |
sns.countplot(x="class", hue="class", data=data)
<matplotlib.axes._subplots.AxesSubplot at 0x7fe9867d5750>
The above graph shows the count of patients in each class: • No Lung Opacity / Not Normal – 11821 • Normal – 8851 • Lung Opacity – 9555
data["Target"].value_counts().plot(kind='bar', subplots=False)
<matplotlib.axes._subplots.AxesSubplot at 0x7fe9e4736f90>
The above graph shows the count of patients for each target:
• Target 0 (No Pneumonia) – 20672
• Target 1 (Pneumonia) – 9555
We can also infer that the dataset is imbalanced between Targets 0 and 1.
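The imbalance can be quantified from the counts above. This sketch restates the reported counts rather than recomputing them from the raw files; in the notebook itself the same figures come from `data['Target'].value_counts(normalize=True)`:

```python
# Imbalance check using the counts reported above: 20672 negatives (Target 0)
# vs. 9555 positives (Target 1).
neg, pos = 20672, 9555
print(f'negative:positive ratio = {neg / pos:.2f}')  # ~2.16
print(f'positive share = {pos / (neg + pos):.3f}')   # ~0.316
```

The positive share of about 0.316 matches the mean of `Target` shown in `data.describe()` above.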
pd.pivot_table(data,index=["class"], values=['patientId'], aggfunc='count')
| class | patientId |
|---|---|
| Lung Opacity | 9555 |
| No Lung Opacity / Not Normal | 11821 |
| Normal | 8851 |
# Check how many patients have bounding box co-ordinates
data.groupby(['Target']).count()
| Target | patientId | x | y | width | height | class |
|---|---|---|---|---|---|---|
| 0 | 20672 | 0 | 0 | 0 | 0 | 20672 |
| 1 | 9555 | 9555 | 9555 | 9555 | 9555 | 9555 |
# Analyse the class distribution in percent
data["class"].value_counts().plot(kind='pie',autopct='%1.0f%%', shadow=True, subplots=False)
<matplotlib.axes._subplots.AxesSubplot at 0x7fe9e48f6490>
# Count of patients with a single row vs. multiple rows
data['patientId'].value_counts().value_counts()
1    23286
2     3266
3      119
4       13
Name: patientId, dtype: int64
# Patients who do not have pneumonia have only one record in the table
data[data['Target'] == 0]['patientId'].value_counts().value_counts()
1    20672
Name: patientId, dtype: int64
# Replace NaN bounding-box values with 0.0 using fillna
# (note: fillna returns a new DataFrame; assign the result, e.g. data = data.fillna(0.0), to persist the change)
data.fillna(0.0)
| | patientId | x | y | width | height | Target | class |
|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | 0.0 | 0.0 | 0.0 | 0.0 | 0 | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | 0.0 | 0.0 | 0.0 | 0.0 | 0 | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | 1 | Lung Opacity |
| ... | ... | ... | ... | ... | ... | ... | ... |
| 30222 | c1ec14ff-f6d7-4b38-b0cb-fe07041cbdc8 | 185.0 | 298.0 | 228.0 | 379.0 | 1 | Lung Opacity |
| 30223 | c1edf42b-5958-47ff-a1e7-4f23d99583ba | 0.0 | 0.0 | 0.0 | 0.0 | 0 | Normal |
| 30224 | c1f6b555-2eb1-4231-98f6-50a963976431 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | Normal |
| 30225 | c1f7889a-9ea9-4acb-b64c-b737c929599a | 570.0 | 393.0 | 261.0 | 345.0 | 1 | Lung Opacity |
| 30226 | c1f7889a-9ea9-4acb-b64c-b737c929599a | 233.0 | 424.0 | 201.0 | 356.0 | 1 | Lung Opacity |
30227 rows × 7 columns
sns.pairplot(data,palette='husl')
<seaborn.axisgrid.PairGrid at 0x7fe9e482bc50>
The above pair plot helps us understand the positive and negative correlations between attributes. From the plot, a few attribute pairs show positive or negative correlation. To quantify these relationships, we compute the correlation coefficients.
data.corr()
| | x | y | width | height | Target |
|---|---|---|---|---|---|
| x | 1.000000 | 0.007604 | -0.058665 | 0.008256 | NaN |
| y | 0.007604 | 1.000000 | -0.299897 | -0.645369 | NaN |
| width | -0.058665 | -0.299897 | 1.000000 | 0.597461 | NaN |
| height | 0.008256 | -0.645369 | 0.597461 | 1.000000 | NaN |
| Target | NaN | NaN | NaN | NaN | 1.0 |
corr=data.corr()
plt.figure(figsize=(8,8))
sns.heatmap(corr,annot=True)
<matplotlib.axes._subplots.AxesSubplot at 0x7fe9842ea6d0>
The above heat map shows the values of the correlation coefficients: • There is a moderate positive correlation between height and width, i.e., 0.6. • There is a moderate negative correlation between height and y, i.e., -0.65.
sns.jointplot(x = 'width', y = 'height', data = data, kind="reg")
<seaborn.axisgrid.JointGrid at 0x7fe98296ee50>
sns.jointplot(x = 'y', y = 'height', data = data, kind="reg")
<seaborn.axisgrid.JointGrid at 0x7fe982765250>
To dig deeper into these correlations, we visualize joint plots of height vs. width and height vs. y. The regression lines confirm a positive and a negative correlation, respectively. The plots above also show the distribution of each attribute.
!pip install pydicom
Requirement already satisfied: pydicom in /usr/local/lib/python3.7/dist-packages (2.1.2)
import pydicom
import pydicom as dcm
from pydicom import dcmread
import glob
train_image = glob.glob(train_path + '*.dcm')
test_image = glob.glob(test_path + '*.dcm')
print('Number of images in train image list: ', len(train_image))
print('Number of images in test image list: ', len(test_image))
Number of images in train image list:  25685
Number of images in test image list:  3001
dicom_df=data
dicom_df.shape
(30227, 7)
from tqdm import tqdm
from pathlib import Path
def process_dicom_data(data_df):
    for n, pid in tqdm(enumerate(data_df['patientId'].unique())):
        dcm_file = train_path + '%s.dcm' % pid
        path = Path(dcm_file)
        if path.is_file():
            dcm_data = pydicom.dcmread(dcm_file)
            idx = (data_df['patientId'] == dcm_data.PatientID)
            data_df.loc[idx, 'Modality'] = dcm_data.Modality
            data_df.loc[idx, 'PatientAge'] = pd.to_numeric(dcm_data.PatientAge)
            data_df.loc[idx, 'PatientSex'] = dcm_data.PatientSex
            data_df.loc[idx, 'BodyPartExamined'] = dcm_data.BodyPartExamined
            data_df.loc[idx, 'ViewPosition'] = dcm_data.ViewPosition
    return data_df
# We ran the code below once to fetch the various image attributes; to save time on subsequent runs, we converted the result to CSV and saved it to a local directory. From then on we can read the CSV directly, which avoids re-reading the DICOM data every time.
#dicom_data = process_dicom_data(data)
#dicom_data.to_csv('C:/Users/Z004339P/Desktop/Imp Training/PG AIML/All Course Content/Submitted Projects/Capstone Project/Capstone Project/dicom.csv')
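The compute-once / read-later caching idea described above can be sketched as a small helper. The temp-file path and the toy build function below are placeholders for illustration, not the notebook's actual paths or DICOM processing:

```python
import os
import tempfile

import pandas as pd

# If the CSV cache exists, read it; otherwise build the DataFrame once
# and persist it for the next run.
def load_or_build(build_fn, path):
    if os.path.exists(path):
        return pd.read_csv(path)
    df = build_fn()
    df.to_csv(path, index=False)
    return df

cache_path = os.path.join(tempfile.gettempdir(), 'dicom_cache_demo.csv')
if os.path.exists(cache_path):
    os.remove(cache_path)  # start clean so the demo is deterministic
demo = load_or_build(lambda: pd.DataFrame({'patientId': ['p1'], 'PatientAge': [51]}),
                     cache_path)
print(demo.shape)  # (1, 2)
```

A second call with the same path skips the build function entirely and reads the cached CSV.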
dicom_data = pd.read_csv('/content/drive/MyDrive/Raw Data/dicom.csv')
dicom_data.nunique()
Unnamed: 0          30227
patientId           26684
x                     748
y                     726
width                 351
height                725
Target                  2
class                   3
Modality                1
PatientAge             97
PatientSex              2
BodyPartExamined        1
ViewPosition            2
dtype: int64
dicom_data.describe()
| | Unnamed: 0 | x | y | width | height | Target | PatientAge |
|---|---|---|---|---|---|---|---|
| count | 30227.00000 | 9555.000000 | 9555.000000 | 9555.000000 | 9555.000000 | 30227.000000 | 29145.000000 |
| mean | 15113.00000 | 394.047724 | 366.839560 | 218.471376 | 329.269702 | 0.316108 | 46.791765 |
| std | 8725.92763 | 204.574172 | 148.940488 | 59.289475 | 157.750755 | 0.464963 | 16.902249 |
| min | 0.00000 | 2.000000 | 2.000000 | 40.000000 | 45.000000 | 0.000000 | 1.000000 |
| 25% | 7556.50000 | 207.000000 | 249.000000 | 177.000000 | 203.000000 | 0.000000 | 34.000000 |
| 50% | 15113.00000 | 324.000000 | 365.000000 | 217.000000 | 298.000000 | 0.000000 | 49.000000 |
| 75% | 22669.50000 | 594.000000 | 478.500000 | 259.000000 | 438.000000 | 1.000000 | 59.000000 |
| max | 30226.00000 | 835.000000 | 881.000000 | 528.000000 | 942.000000 | 1.000000 | 155.000000 |
dcm_file = train_path + '0004cfab-14fd-4e49-80ba-63a80b6bddd6.dcm'
dcm_data = pydicom.dcmread(dcm_file)
Read the CXR images, extract one image, and inspect the DICOM information. Some useful information with predictive value is available in the DICOM metadata, for example: • Patient sex, patient age, modality, view position, rows and columns, pixel spacing • The actual CXR image is stored in the last element, tagged Pixel Data, in array format • All the remaining tags or elements are metadata providing additional details
dcm_data
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 202
(0002, 0001) File Meta Information Version       OB: b'\x00\x01'
(0002, 0002) Media Storage SOP Class UID         UI: Secondary Capture Image Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.276.0.7230010.3.1.4.8323329.28530.1517874485.775526
(0002, 0010) Transfer Syntax UID                 UI: JPEG Baseline (Process 1)
(0002, 0012) Implementation Class UID            UI: 1.2.276.0.7230010.3.0.3.6.0
(0002, 0013) Implementation Version Name         SH: 'OFFIS_DCMTK_360'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0016) SOP Class UID                       UI: Secondary Capture Image Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.276.0.7230010.3.1.4.8323329.28530.1517874485.775526
(0008, 0020) Study Date                          DA: '19010101'
(0008, 0030) Study Time                          TM: '000000.00'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'CR'
(0008, 0064) Conversion Type                     CS: 'WSD'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 103e) Series Description                  LO: 'view: PA'
(0010, 0010) Patient's Name                      PN: '0004cfab-14fd-4e49-80ba-63a80b6bddd6'
(0010, 0020) Patient ID                          LO: '0004cfab-14fd-4e49-80ba-63a80b6bddd6'
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: 'F'
(0010, 1010) Patient's Age                       AS: '51'
(0018, 0015) Body Part Examined                  CS: 'CHEST'
(0018, 5101) View Position                       CS: 'PA'
(0020, 000d) Study Instance UID                  UI: 1.2.276.0.7230010.3.1.2.8323329.28530.1517874485.775525
(0020, 000e) Series Instance UID                 UI: 1.2.276.0.7230010.3.1.3.8323329.28530.1517874485.775524
(0020, 0010) Study ID                            SH: ''
(0020, 0011) Series Number                       IS: "1"
(0020, 0013) Instance Number                     IS: "1"
(0020, 0020) Patient Orientation                 CS: ''
(0028, 0002) Samples per Pixel                   US: 1
(0028, 0004) Photometric Interpretation          CS: 'MONOCHROME2'
(0028, 0010) Rows                                US: 1024
(0028, 0011) Columns                             US: 1024
(0028, 0030) Pixel Spacing                       DS: [0.14300000000000002, 0.14300000000000002]
(0028, 0100) Bits Allocated                      US: 8
(0028, 0101) Bits Stored                         US: 8
(0028, 0102) High Bit                            US: 7
(0028, 0103) Pixel Representation                US: 0
(0028, 2110) Lossy Image Compression             CS: '01'
(0028, 2114) Lossy Image Compression Method      CS: 'ISO_10918_1'
(7fe0, 0010) Pixel Data                          OB: Array of 142006 elements
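Pulling the predictive header fields out of a dataset like the one above can be sketched as follows. The `SimpleNamespace` stand-in (with values copied from the header shown above) mimics the attribute access of a `pydicom` dataset so the sketch is self-contained:

```python
from types import SimpleNamespace

# Collect the metadata fields with predictive value from a DICOM dataset.
# `ds` stands for any object returned by pydicom.dcmread.
def extract_metadata(ds):
    return {
        'Modality': ds.Modality,
        'PatientAge': int(ds.PatientAge),  # stored as an age string, e.g. '51'
        'PatientSex': ds.PatientSex,
        'ViewPosition': ds.ViewPosition,
    }

# Stand-in mirroring the header values shown above.
ds = SimpleNamespace(Modality='CR', PatientAge='51', PatientSex='F', ViewPosition='PA')
print(extract_metadata(ds))  # {'Modality': 'CR', 'PatientAge': 51, 'PatientSex': 'F', 'ViewPosition': 'PA'}
```

This is the same per-file logic that `process_dicom_data` applies across the whole table.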
dicom_data_new = dicom_data.drop('Target', axis=1)
# Encode the categorical columns as binary flags: PatientSex (M = 1, F = 0) and ViewPosition (AP = 1, PA = 0)
dicom_data_new['PatientSex'] = np.where(dicom_data_new["PatientSex"].str.contains("M"), 1, 0)
dicom_data_new['ViewPosition'] = np.where(dicom_data_new["ViewPosition"].str.contains("AP"), 1, 0)
dicom_data_new.head()
| | Unnamed: 0 | patientId | x | y | width | height | class | Modality | PatientAge | PatientSex | BodyPartExamined | ViewPosition |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | NaN | NaN | NaN | NaN | No Lung Opacity / Not Normal | CR | 51.0 | 0 | CHEST | 0 |
| 1 | 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | NaN | NaN | NaN | NaN | No Lung Opacity / Not Normal | CR | 48.0 | 0 | CHEST | 0 |
| 2 | 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | NaN | NaN | NaN | NaN | No Lung Opacity / Not Normal | CR | 19.0 | 1 | CHEST | 1 |
| 3 | 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | NaN | NaN | NaN | NaN | Normal | CR | 28.0 | 1 | CHEST | 0 |
| 4 | 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | Lung Opacity | CR | 32.0 | 0 | CHEST | 1 |
dicom_data_new.corr()
| | Unnamed: 0 | x | y | width | height | PatientAge | PatientSex | ViewPosition |
|---|---|---|---|---|---|---|---|---|
| Unnamed: 0 | 1.000000 | 0.016929 | 0.030672 | -0.117766 | -0.068564 | -0.003701 | 0.053081 | 0.053706 |
| x | 0.016929 | 1.000000 | 0.007604 | -0.058665 | 0.008256 | -0.015204 | -0.016444 | 0.022251 |
| y | 0.030672 | 0.007604 | 1.000000 | -0.299897 | -0.645369 | 0.100636 | 0.052027 | -0.122808 |
| width | -0.117766 | -0.058665 | -0.299897 | 1.000000 | 0.597461 | 0.027430 | 0.112876 | 0.085789 |
| height | -0.068564 | 0.008256 | -0.645369 | 0.597461 | 1.000000 | -0.002153 | 0.051168 | 0.273153 |
| PatientAge | -0.003701 | -0.015204 | 0.100636 | 0.027430 | -0.002153 | 1.000000 | -0.009945 | -0.058790 |
| PatientSex | 0.053081 | -0.016444 | 0.052027 | 0.112876 | 0.051168 | -0.009945 | 1.000000 | 0.044859 |
| ViewPosition | 0.053706 | 0.022251 | -0.122808 | 0.085789 | 0.273153 | -0.058790 | 0.044859 | 1.000000 |
# Locating the pneumonia position (bounding-box centers)
dicom_data['x_center']=dicom_data['x'] + dicom_data['width']/2
dicom_data['y_center']=dicom_data['y'] + dicom_data['height']/2
# Plot x and y centers
sns.jointplot(x="x_center", y="y_center", kind="kde", data=dicom_data, height=6, alpha=0.5)
plt.suptitle('Pneumonia location')
Text(0.5, 0.98, 'Pneumonia location')
The above contour/joint plot shows the position of pneumonia in the lungs and the number of affected areas per CXR image. The contours form two clusters, one per lung, indicating the two regions most commonly affected by pneumonia.
plt.figure(figsize = (30, 10))
sns.countplot(x = 'PatientAge', hue = 'Target', data = dicom_data)
<matplotlib.axes._subplots.AxesSubplot at 0x7fe981dae4d0>
The above graph shows the distribution of patient age in pneumonia detection. The orange bars show pneumonia patients; most pneumonia cases lie between 33 and 59 years of age, suggesting that middle-aged adults account for the majority of pneumonia cases in this dataset.
sns.countplot(x = 'PatientSex', hue = 'Target', data = dicom_data)
<matplotlib.axes._subplots.AxesSubplot at 0x7fe98188c050>
sns.countplot(x = 'ViewPosition', hue = 'Target', data = dicom_data)
<matplotlib.axes._subplots.AxesSubplot at 0x7fe9816f6b10>
The above graphs show the distribution of patient sex and view position in pneumonia detection. The orange bars show pneumonia patients and the blue bars show normal cases: • Patient sex graph – pneumonia cases are more frequent in males than in females in this dataset. • View position graph – pneumonia is detected more often in the AP view position than in the PA view.
def show_dicom_image(data_df):
    img_data = list(data_df.T.to_dict().values())
    f, ax = plt.subplots(2, 2, figsize=(16, 18))
    for i, data_row in enumerate(img_data):
        pid = data_row['patientId']
        dcm_file = train_path + '%s.dcm' % pid
        dcm_data = pydicom.dcmread(dcm_file)
        ax[i//2, i%2].imshow(dcm_data.pixel_array, cmap=plt.cm.bone)
        ax[i//2, i%2].set_title('ID: {}\n Age: {} Sex: {}'.format(
            data_row['patientId'], dcm_data.PatientAge, dcm_data.PatientSex))
show_dicom_image(dicom_data[dicom_data['Target']==1].sample(n=4))
Showing some random DICOM images of patients who do not have pneumonia, but whose class is No Lung Opacity / Not Normal
show_dicom_image(dicom_data[ (dicom_data['Target']==0) & (dicom_data['class']=='No Lung Opacity / Not Normal')].sample(n=4))
Showing some random DICOM images of patients who do not have pneumonia, with class Normal
show_dicom_image(dicom_data[ (dicom_data['Target']==0) & (dicom_data['class']=='Normal')].sample(n=4))
def show_dicome_with_boundingbox(data_df):
    img_data = list(data_df.T.to_dict().values())
    f, ax = plt.subplots(2, 2, figsize=(16, 18))
    for i, data_row in enumerate(img_data):
        pid = data_row['patientId']
        dcm_file = train_path + '%s.dcm' % pid
        dcm_data = pydicom.dcmread(dcm_file)
        ax[i//2, i%2].imshow(dcm_data.pixel_array, cmap=plt.cm.bone)
        ax[i//2, i%2].set_title('ID: {}\n Age: {} Sex: {}'.format(
            data_row['patientId'], dcm_data.PatientAge, dcm_data.PatientSex))
        # overlay every bounding box recorded for this patient
        rows = data_df[data_df['patientId'] == data_row['patientId']]
        box_data = list(rows.T.to_dict().values())
        for j, row in enumerate(box_data):
            x, y, width, height = row['x'], row['y'], row['width'], row['height']
            rectangle = Rectangle(xy=(x, y), width=width, height=height, color="red", alpha=0.1)
            ax[i//2, i%2].add_patch(rectangle)
show_dicome_with_boundingbox(dicom_data[dicom_data['Target']==1].sample(n=4))
import os
import csv
import random
import pydicom
import numpy as np
import pandas as pd
from skimage import measure
from skimage.transform import resize
import matplotlib.patches as patches
import tensorflow as tf
from tensorflow import keras
Load pneumonia locations
The table contains one [filename : pneumonia location] pair per row. If a filename contains multiple pneumonia regions, the table contains multiple rows with the same filename but different locations. If a filename contains no pneumonia, it has a single row with an empty location. The code below loads the table and transforms it into a dictionary that uses the filename as key and the list of pneumonia locations in that file as value. If a filename is not present in the dictionary, it contains no pneumonia.
pneumonia_locations = {}
# load table
with open(os.path.join('/content/drive/MyDrive/Raw Data/stage_2_train_labels.csv'), mode='r') as infile:
    # open reader and skip header
    reader = csv.reader(infile)
    next(reader, None)
    # loop through rows
    for rows in reader:
        # retrieve information
        filename = rows[0]
        location = rows[1:5]
        pneumonia = rows[5]
        # if the row contains pneumonia, add the label to the dictionary,
        # which holds a list of pneumonia locations per filename
        if pneumonia == '1':
            # convert string to float to int
            location = [int(float(i)) for i in location]
            # save pneumonia location in dictionary
            if filename in pneumonia_locations:
                pneumonia_locations[filename].append(location)
            else:
                pneumonia_locations[filename] = [location]
Load image filenames
# load and shuffle filenames
folder = train_path
filenames = os.listdir(folder)
random.shuffle(filenames)
# split into train and validation filenames
n_valid_samples = 2560
train_filenames = filenames[n_valid_samples:]
valid_filenames = filenames[:n_valid_samples]
print('n train samples', len(train_filenames))
print('n valid samples', len(valid_filenames))
n_train_samples = len(filenames) - n_valid_samples
n train samples 23125
n valid samples 2560
Data generator
The dataset is too large to fit into memory, so we need to create a generator that loads data on the fly.
The generator takes in some filenames, batch_size and other parameters.
The generator outputs a random batch of numpy images and numpy masks.
class generator(keras.utils.Sequence):
    def __init__(self, folder, filenames, pneumonia_locations=None, batch_size=32,
                 image_size=256, shuffle=True, augment=False, predict=False):
        self.folder = folder
        self.filenames = filenames
        self.pneumonia_locations = pneumonia_locations
        self.batch_size = batch_size
        self.image_size = image_size
        self.shuffle = shuffle
        self.augment = augment
        self.predict = predict
        self.on_epoch_end()

    def __load__(self, filename):
        # load dicom file as numpy array
        img = pydicom.dcmread(os.path.join(self.folder, filename)).pixel_array
        # create empty mask
        msk = np.zeros(img.shape)
        # get filename without extension
        filename = filename.split('.')[0]
        # if the image contains pneumonia (guard against a missing locations dict)
        if self.pneumonia_locations and filename in self.pneumonia_locations:
            # add 1's at each pneumonia location
            for location in self.pneumonia_locations[filename]:
                x, y, w, h = location
                msk[y:y+h, x:x+w] = 1
        # resize both image and mask
        img = resize(img, (self.image_size, self.image_size), mode='reflect')
        msk = resize(msk, (self.image_size, self.image_size), mode='reflect') > 0.5
        # if augmenting, horizontally flip half the time
        if self.augment and random.random() > 0.5:
            img = np.fliplr(img)
            msk = np.fliplr(msk)
        # add trailing channel dimension
        img = np.expand_dims(img, -1)
        msk = np.expand_dims(msk, -1)
        return img, msk

    def __loadpredict__(self, filename):
        # load dicom file as numpy array
        img = pydicom.dcmread(os.path.join(self.folder, filename)).pixel_array
        # resize image
        img = resize(img, (self.image_size, self.image_size), mode='reflect')
        # add trailing channel dimension
        img = np.expand_dims(img, -1)
        return img

    def __getitem__(self, index):
        # select batch
        filenames = self.filenames[index*self.batch_size:(index+1)*self.batch_size]
        # predict mode: return images and filenames
        if self.predict:
            imgs = np.array([self.__loadpredict__(filename) for filename in filenames])
            return imgs, filenames
        # train mode: return images and masks
        else:
            items = [self.__load__(filename) for filename in filenames]
            imgs, msks = zip(*items)
            return np.array(imgs), np.array(msks)

    def on_epoch_end(self):
        if self.shuffle:
            random.shuffle(self.filenames)

    def __len__(self):
        if self.predict:
            # return everything, including the last partial batch
            return int(np.ceil(len(self.filenames) / self.batch_size))
        else:
            # return full batches only
            return int(len(self.filenames) / self.batch_size)
# define iou (jaccard) loss function
def iou_loss(y_true, y_pred):
    y_true = tf.cast(y_true, tf.float32)
    y_pred = tf.cast(y_pred, tf.float32)
    y_true = tf.reshape(y_true, [-1])
    y_pred = tf.reshape(y_pred, [-1])
    intersection = tf.reduce_sum(y_true * y_pred)
    score = (intersection + 1.) / (tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) - intersection + 1.)
    return 1 - score
# combine bce loss and iou loss
def iou_bce_loss(y_true, y_pred):
    return 0.5 * keras.losses.binary_crossentropy(y_true, y_pred) + 0.5 * iou_loss(y_true, y_pred)
# mean iou as a metric
def mean_iou(y_true, y_pred):
    y_pred = tf.round(y_pred)
    intersect = tf.reduce_sum(y_true * y_pred, axis=[1, 2, 3])
    union = tf.reduce_sum(y_true, axis=[1, 2, 3]) + tf.reduce_sum(y_pred, axis=[1, 2, 3])
    smooth = tf.ones(tf.shape(intersect))
    return tf.reduce_mean((intersect + smooth) / (union - intersect + smooth))
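The smoothed IoU above can be sanity-checked with a small NumPy re-implementation. This is TensorFlow-free, so it only illustrates the arithmetic, not the training code: identical masks give a loss of 0, and disjoint masks approach 1.

```python
import numpy as np

# NumPy sketch of the smoothed IoU loss: 1 - (intersection + 1) / (union + 1).
def iou_loss_np(y_true, y_pred):
    y_true = y_true.ravel().astype(float)
    y_pred = y_pred.ravel().astype(float)
    inter = (y_true * y_pred).sum()
    union = y_true.sum() + y_pred.sum() - inter
    return 1.0 - (inter + 1.0) / (union + 1.0)

mask = np.ones((4, 4))
print(iou_loss_np(mask, mask))                        # 0.0 (perfect overlap)
print(round(iou_loss_np(mask, np.zeros((4, 4))), 3))  # 0.941 (no overlap)
```

The +1 smoothing term keeps the loss defined even when both masks are empty, which happens often since most images contain no pneumonia.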
def create_downsample(channels, inputs):
    x = keras.layers.BatchNormalization(momentum=0.9)(inputs)
    x = keras.layers.LeakyReLU(0)(x)
    x = keras.layers.Conv2D(channels, 1, padding='same', use_bias=False)(x)
    x = keras.layers.MaxPool2D(2)(x)
    return x
def create_resblock(channels, inputs):
    x = keras.layers.BatchNormalization(momentum=0.9)(inputs)
    x = keras.layers.LeakyReLU(0)(x)
    x = keras.layers.Conv2D(channels, 3, padding='same', use_bias=False)(x)
    x = keras.layers.BatchNormalization(momentum=0.9)(x)
    x = keras.layers.LeakyReLU(0)(x)
    x = keras.layers.Conv2D(channels, 3, padding='same', use_bias=False)(x)
    return keras.layers.add([x, inputs])
def create_network(input_size, channels, n_blocks=2, depth=4):
    # input
    inputs = keras.Input(shape=(input_size, input_size, 1))
    x = keras.layers.Conv2D(channels, 3, padding='same', use_bias=False)(inputs)
    # downsampling stages with residual blocks
    for d in range(depth):
        channels = channels * 2
        x = create_downsample(channels, x)
        for b in range(n_blocks):
            x = create_resblock(channels, x)
    # output
    x = keras.layers.BatchNormalization(momentum=0.9)(x)
    x = keras.layers.LeakyReLU(0)(x)
    x = keras.layers.Conv2D(1, 1, activation='sigmoid')(x)
    outputs = keras.layers.UpSampling2D(2**depth)(x)
    model = keras.Model(inputs=inputs, outputs=outputs)
    return model
BATCH_SIZE = 128
IMAGE_SIZE = 128
model1 = create_network(input_size=IMAGE_SIZE, channels=32, n_blocks=2, depth=4)
model1.compile(optimizer='adam', loss=iou_bce_loss, metrics=['accuracy', mean_iou])
# cosine learning rate annealing
def cosine_annealing(x):
    lr = 0.0001
    epochs = 3
    return lr * (np.cos(np.pi * x / epochs) + 1.) / 2
learning_rate = tf.keras.callbacks.LearningRateScheduler(cosine_annealing)
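A quick check of the schedule's endpoints. The function is restated below with its constants as parameters so the sketch runs without TensorFlow: the learning rate starts at 1e-4, halves at the midpoint, and decays to 0 along a half cosine over the 3 epochs.

```python
import numpy as np

# Same schedule as above: lr * (cos(pi * x / epochs) + 1) / 2.
def cosine_annealing(x, lr=0.0001, epochs=3):
    return lr * (np.cos(np.pi * x / epochs) + 1.) / 2

print(cosine_annealing(0))    # 0.0001 at the start
print(cosine_annealing(1.5))  # ~5e-05 at the midpoint
print(cosine_annealing(3))    # ~0.0 at the end
```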
# create train and validation generators
folder = train_path
train_gen = generator(folder, train_filenames, pneumonia_locations, batch_size=BATCH_SIZE, image_size=IMAGE_SIZE, shuffle=True, augment=False, predict=False)
valid_gen = generator(folder, valid_filenames, pneumonia_locations, batch_size=BATCH_SIZE, image_size=IMAGE_SIZE, shuffle=False, predict=False)
print(model1.summary())
Model: "model_1"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) [(None, 128, 128, 1) 0
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, 128, 128, 32) 288 input_2[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 128, 128, 32) 128 conv2d_22[0][0]
__________________________________________________________________________________________________
leaky_re_lu_21 (LeakyReLU) (None, 128, 128, 32) 0 batch_normalization_21[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, 128, 128, 64) 2048 leaky_re_lu_21[0][0]
__________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D) (None, 64, 64, 64) 0 conv2d_23[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 64, 64, 64) 256 max_pooling2d_4[0][0]
__________________________________________________________________________________________________
leaky_re_lu_22 (LeakyReLU) (None, 64, 64, 64) 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, 64, 64, 64) 36864 leaky_re_lu_22[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 64, 64, 64) 256 conv2d_24[0][0]
__________________________________________________________________________________________________
leaky_re_lu_23 (LeakyReLU) (None, 64, 64, 64) 0 batch_normalization_23[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, 64, 64, 64) 36864 leaky_re_lu_23[0][0]
__________________________________________________________________________________________________
add_8 (Add) (None, 64, 64, 64) 0 conv2d_25[0][0]
max_pooling2d_4[0][0]
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 64, 64, 64) 256 add_8[0][0]
__________________________________________________________________________________________________
leaky_re_lu_24 (LeakyReLU) (None, 64, 64, 64) 0 batch_normalization_24[0][0]
__________________________________________________________________________________________________
conv2d_26 (Conv2D) (None, 64, 64, 64) 36864 leaky_re_lu_24[0][0]
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 64, 64, 64) 256 conv2d_26[0][0]
__________________________________________________________________________________________________
leaky_re_lu_25 (LeakyReLU) (None, 64, 64, 64) 0 batch_normalization_25[0][0]
__________________________________________________________________________________________________
conv2d_27 (Conv2D) (None, 64, 64, 64) 36864 leaky_re_lu_25[0][0]
__________________________________________________________________________________________________
add_9 (Add) (None, 64, 64, 64) 0 conv2d_27[0][0]
add_8[0][0]
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, 64, 64, 64) 256 add_9[0][0]
__________________________________________________________________________________________________
leaky_re_lu_26 (LeakyReLU) (None, 64, 64, 64) 0 batch_normalization_26[0][0]
__________________________________________________________________________________________________
conv2d_28 (Conv2D) (None, 64, 64, 128) 8192 leaky_re_lu_26[0][0]
__________________________________________________________________________________________________
max_pooling2d_5 (MaxPooling2D) (None, 32, 32, 128) 0 conv2d_28[0][0]
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, 32, 32, 128) 512 max_pooling2d_5[0][0]
__________________________________________________________________________________________________
leaky_re_lu_27 (LeakyReLU) (None, 32, 32, 128) 0 batch_normalization_27[0][0]
__________________________________________________________________________________________________
conv2d_29 (Conv2D) (None, 32, 32, 128) 147456 leaky_re_lu_27[0][0]
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, 32, 32, 128) 512 conv2d_29[0][0]
__________________________________________________________________________________________________
leaky_re_lu_28 (LeakyReLU) (None, 32, 32, 128) 0 batch_normalization_28[0][0]
__________________________________________________________________________________________________
conv2d_30 (Conv2D) (None, 32, 32, 128) 147456 leaky_re_lu_28[0][0]
__________________________________________________________________________________________________
add_10 (Add) (None, 32, 32, 128) 0 conv2d_30[0][0]
max_pooling2d_5[0][0]
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, 32, 32, 128) 512 add_10[0][0]
__________________________________________________________________________________________________
leaky_re_lu_29 (LeakyReLU) (None, 32, 32, 128) 0 batch_normalization_29[0][0]
__________________________________________________________________________________________________
conv2d_31 (Conv2D) (None, 32, 32, 128) 147456 leaky_re_lu_29[0][0]
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, 32, 32, 128) 512 conv2d_31[0][0]
__________________________________________________________________________________________________
leaky_re_lu_30 (LeakyReLU) (None, 32, 32, 128) 0 batch_normalization_30[0][0]
__________________________________________________________________________________________________
conv2d_32 (Conv2D) (None, 32, 32, 128) 147456 leaky_re_lu_30[0][0]
__________________________________________________________________________________________________
add_11 (Add) (None, 32, 32, 128) 0 conv2d_32[0][0]
add_10[0][0]
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, 32, 32, 128) 512 add_11[0][0]
__________________________________________________________________________________________________
leaky_re_lu_31 (LeakyReLU) (None, 32, 32, 128) 0 batch_normalization_31[0][0]
__________________________________________________________________________________________________
conv2d_33 (Conv2D) (None, 32, 32, 256) 32768 leaky_re_lu_31[0][0]
__________________________________________________________________________________________________
max_pooling2d_6 (MaxPooling2D) (None, 16, 16, 256) 0 conv2d_33[0][0]
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, 16, 16, 256) 1024 max_pooling2d_6[0][0]
__________________________________________________________________________________________________
leaky_re_lu_32 (LeakyReLU) (None, 16, 16, 256) 0 batch_normalization_32[0][0]
__________________________________________________________________________________________________
conv2d_34 (Conv2D) (None, 16, 16, 256) 589824 leaky_re_lu_32[0][0]
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, 16, 16, 256) 1024 conv2d_34[0][0]
__________________________________________________________________________________________________
leaky_re_lu_33 (LeakyReLU) (None, 16, 16, 256) 0 batch_normalization_33[0][0]
__________________________________________________________________________________________________
conv2d_35 (Conv2D) (None, 16, 16, 256) 589824 leaky_re_lu_33[0][0]
__________________________________________________________________________________________________
add_12 (Add) (None, 16, 16, 256) 0 conv2d_35[0][0]
max_pooling2d_6[0][0]
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, 16, 16, 256) 1024 add_12[0][0]
__________________________________________________________________________________________________
leaky_re_lu_34 (LeakyReLU) (None, 16, 16, 256) 0 batch_normalization_34[0][0]
__________________________________________________________________________________________________
conv2d_36 (Conv2D) (None, 16, 16, 256) 589824 leaky_re_lu_34[0][0]
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, 16, 16, 256) 1024 conv2d_36[0][0]
__________________________________________________________________________________________________
leaky_re_lu_35 (LeakyReLU) (None, 16, 16, 256) 0 batch_normalization_35[0][0]
__________________________________________________________________________________________________
conv2d_37 (Conv2D) (None, 16, 16, 256) 589824 leaky_re_lu_35[0][0]
__________________________________________________________________________________________________
add_13 (Add) (None, 16, 16, 256) 0 conv2d_37[0][0]
add_12[0][0]
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, 16, 16, 256) 1024 add_13[0][0]
__________________________________________________________________________________________________
leaky_re_lu_36 (LeakyReLU) (None, 16, 16, 256) 0 batch_normalization_36[0][0]
__________________________________________________________________________________________________
conv2d_38 (Conv2D) (None, 16, 16, 512) 131072 leaky_re_lu_36[0][0]
__________________________________________________________________________________________________
max_pooling2d_7 (MaxPooling2D) (None, 8, 8, 512) 0 conv2d_38[0][0]
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, 8, 8, 512) 2048 max_pooling2d_7[0][0]
__________________________________________________________________________________________________
leaky_re_lu_37 (LeakyReLU) (None, 8, 8, 512) 0 batch_normalization_37[0][0]
__________________________________________________________________________________________________
conv2d_39 (Conv2D) (None, 8, 8, 512) 2359296 leaky_re_lu_37[0][0]
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, 8, 8, 512) 2048 conv2d_39[0][0]
__________________________________________________________________________________________________
leaky_re_lu_38 (LeakyReLU) (None, 8, 8, 512) 0 batch_normalization_38[0][0]
__________________________________________________________________________________________________
conv2d_40 (Conv2D) (None, 8, 8, 512) 2359296 leaky_re_lu_38[0][0]
__________________________________________________________________________________________________
add_14 (Add) (None, 8, 8, 512) 0 conv2d_40[0][0]
max_pooling2d_7[0][0]
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, 8, 8, 512) 2048 add_14[0][0]
__________________________________________________________________________________________________
leaky_re_lu_39 (LeakyReLU) (None, 8, 8, 512) 0 batch_normalization_39[0][0]
__________________________________________________________________________________________________
conv2d_41 (Conv2D) (None, 8, 8, 512) 2359296 leaky_re_lu_39[0][0]
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, 8, 8, 512) 2048 conv2d_41[0][0]
__________________________________________________________________________________________________
leaky_re_lu_40 (LeakyReLU) (None, 8, 8, 512) 0 batch_normalization_40[0][0]
__________________________________________________________________________________________________
conv2d_42 (Conv2D) (None, 8, 8, 512) 2359296 leaky_re_lu_40[0][0]
__________________________________________________________________________________________________
add_15 (Add) (None, 8, 8, 512) 0 conv2d_42[0][0]
add_14[0][0]
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, 8, 8, 512) 2048 add_15[0][0]
__________________________________________________________________________________________________
leaky_re_lu_41 (LeakyReLU) (None, 8, 8, 512) 0 batch_normalization_41[0][0]
__________________________________________________________________________________________________
conv2d_43 (Conv2D) (None, 8, 8, 1) 513 leaky_re_lu_41[0][0]
__________________________________________________________________________________________________
up_sampling2d_1 (UpSampling2D) (None, 128, 128, 1) 0 conv2d_43[0][0]
==================================================================================================
Total params: 12,727,969
Trainable params: 12,718,305
Non-trainable params: 9,664
__________________________________________________________________________________________________
None
EPOCHS=3
MULTI_PROCESSING = True
history1 = model1.fit_generator(train_gen, validation_data=valid_gen, callbacks=[learning_rate], epochs=EPOCHS, workers=4, use_multiprocessing=MULTI_PROCESSING)
Epoch 1/3 180/180 [==============================] - 2853s 16s/step - loss: 0.5175 - accuracy: 0.9357 - mean_iou: 0.6500 - val_loss: 0.4455 - val_accuracy: 0.9653 - val_mean_iou: 0.7300
Epoch 2/3 180/180 [==============================] - 2305s 13s/step - loss: 0.4382 - accuracy: 0.9669 - mean_iou: 0.7185 - val_loss: 0.4336 - val_accuracy: 0.9693 - val_mean_iou: 0.7369
Epoch 3/3 180/180 [==============================] - 2312s 13s/step - loss: 0.4210 - accuracy: 0.9690 - mean_iou: 0.7304 - val_loss: 0.4252 - val_accuracy: 0.9654 - val_mean_iou: 0.7245
After 3 epochs the model reaches loss: 0.4210 - accuracy: 0.9690 - mean_iou: 0.7304 on the training set and val_loss: 0.4252 - val_accuracy: 0.9654 - val_mean_iou: 0.7245 on the validation set.
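The summary above shows a repeating residual pattern: BatchNorm → LeakyReLU → 3×3 Conv (no bias), twice, followed by an `Add` skip connection. A minimal Keras sketch of one such block (an illustration, not the notebook's exact builder function):

```python
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, channels):
    # BN -> LeakyReLU -> 3x3 conv (no bias), twice, then the additive skip
    y = layers.BatchNormalization()(x)
    y = layers.LeakyReLU()(y)
    y = layers.Conv2D(channels, 3, padding="same", use_bias=False)(y)
    y = layers.BatchNormalization()(y)
    y = layers.LeakyReLU()(y)
    y = layers.Conv2D(channels, 3, padding="same", use_bias=False)(y)
    return layers.Add()([y, x])

inputs = tf.keras.Input(shape=(64, 64, 64))
block = tf.keras.Model(inputs, residual_block(inputs, 64))
print(block.output_shape)  # (None, 64, 64, 64)
```

Because the convolutions use `use_bias=False`, each 3×3 layer on 64 channels contributes 3·3·64·64 = 36,864 parameters, matching the `conv2d_24`-style rows in the summary.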
Plot Accuracy / Loss
plt.figure(figsize=(12,4))
plt.subplot(131)
plt.plot(history1.epoch, history1.history["loss"], label="Train loss")
plt.plot(history1.epoch, history1.history["val_loss"], label="Valid loss")
plt.legend()
plt.subplot(132)
plt.plot(history1.epoch, history1.history["accuracy"], label="Train accuracy")
plt.plot(history1.epoch, history1.history["val_accuracy"], label="Valid accuracy")
plt.legend()
plt.subplot(133)
plt.plot(history1.epoch, history1.history["mean_iou"], label="Train iou")
plt.plot(history1.epoch, history1.history["val_mean_iou"], label="Valid iou")
plt.legend()
plt.show()
Visualize predictions on a validation batch (ground-truth boxes in blue, predicted boxes in red)
from skimage import measure
from matplotlib import patches

i = 0
for imgs, msks in valid_gen:
    # predict batch of images
    preds = model1.predict(imgs)
    # create figure
    f, axarr = plt.subplots(4, 8, figsize=(20, 15))
    axarr = axarr.ravel()
    axidx = 0
    # loop through batch
    for img, msk, pred in zip(imgs, msks, preds):
        i += 1
        # exit after 32 images
        if i > 32:
            break
        # plot image
        axarr[axidx].imshow(img[:, :, 0])
        # threshold ground-truth mask and label connected components
        comp = measure.label(msk[:, :, 0] > 0.5)
        # draw ground-truth bounding boxes in blue
        for region in measure.regionprops(comp):
            y, x, y2, x2 = region.bbox
            axarr[axidx].add_patch(patches.Rectangle((x, y), x2 - x, y2 - y, linewidth=2, edgecolor='b', facecolor='none'))
        # threshold predicted mask and label connected components
        comp = measure.label(pred[:, :, 0] > 0.5)
        # draw predicted bounding boxes in red
        for region in measure.regionprops(comp):
            y, x, y2, x2 = region.bbox
            axarr[axidx].add_patch(patches.Rectangle((x, y), x2 - x, y2 - y, linewidth=2, edgecolor='r', facecolor='none'))
        axidx += 1
    plt.show()
    # only plot one batch
    break
folder = test_path
test_filenames = os.listdir(folder)
print('n test samples:', len(test_filenames))
# create test generator with predict flag set to True
test_gen = generator(folder, test_filenames, None, batch_size=25, image_size=128, shuffle=False, predict=True)
from skimage import measure
from skimage.transform import resize
from tqdm import tqdm

# create submission dictionary
submission_dict = {}
# loop through the test set
for imgs, filenames in tqdm(test_gen):
    # predict batch of images
    preds = model1.predict(imgs)
    # loop through batch
    for pred, filename in zip(preds, filenames):
        # resize predicted mask back to the original 1024x1024 resolution
        pred = resize(pred, (1024, 1024), mode='reflect')
        # threshold predicted mask and label connected components
        comp = measure.label(pred[:, :, 0] > 0.5)
        # build the prediction string from bounding boxes
        predictionString = ''
        for region in measure.regionprops(comp):
            # retrieve x, y, height and width
            y, x, y2, x2 = region.bbox
            height = y2 - y
            width = x2 - x
            # mean mask activation inside the box as a proxy for confidence
            conf = np.mean(pred[y:y+height, x:x+width])
            # add to predictionString
            predictionString += '{} {} {} {} {} '.format(conf, x, y, width, height)
        # add filename (without extension) and predictionString to dictionary
        submission_dict[filename.split('.')[0]] = predictionString
    # stop once every test file has been predicted
    if len(submission_dict) >= len(test_filenames):
        break
# save dictionary as csv file
sub = pd.DataFrame.from_dict(submission_dict, orient='index')
sub.index.names = ['patientId']
sub.columns = ['PredictionString']
sub.to_csv('submission.csv')
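For reference, `DataFrame.from_dict(..., orient='index')` turns the dictionary keys into the index, which is why the index is then renamed to `patientId`. A toy example with hypothetical patient ids:

```python
import pandas as pd

# hypothetical two-patient submission: one box, one empty prediction
submission_dict = {"p1": "0.5 10 20 30 40 ", "p2": ""}
sub = pd.DataFrame.from_dict(submission_dict, orient="index")
sub.index.names = ["patientId"]
sub.columns = ["PredictionString"]
print(sub.shape)       # (2, 1)
print(sub.index.name)  # patientId
```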
n test samples: 3001
Clone YOLOv3
!git clone https://github.com/pjreddie/darknet.git
os.chdir("/content/darknet")
Cloning into 'darknet'...
remote: Enumerating objects: 5934, done.
remote: Total 5934 (delta 0), reused 0 (delta 0), pack-reused 5934
Receiving objects: 100% (5934/5934), 6.35 MiB | 17.47 MiB/s, done.
Resolving deltas: 100% (3926/3926), done.
!git clone https://github.com/ultralytics/yolov3 # clone repo
%cd yolov3
%pip install -qr requirements.txt # install dependencies
import torch
from IPython.display import Image, clear_output # to display images
clear_output()
print(f"Setup complete. Using torch {torch.__version__} ({torch.cuda.get_device_properties(0).name if torch.cuda.is_available() else 'CPU'})")
Setup complete. Using torch 1.8.1+cu101 (Tesla P100-PCIE-16GB)
DATA_DIR = '/content/'
train_dcm_dir = train_path
test_dcm_dir = test_path
img_dir = os.path.join(os.getcwd(), "images") # .jpg
label_dir = os.path.join(os.getcwd(), "labels") # .txt
metadata_dir = os.path.join(os.getcwd(), "metadata") # .txt
# YOLOv3 config file directory
cfg_dir = os.path.join(os.getcwd(), "cfg")
# YOLOv3 training checkpoints will be saved here
backup_dir = os.path.join(os.getcwd(), "backup")
for directory in [img_dir, label_dir, metadata_dir, cfg_dir, backup_dir]:
    if os.path.isdir(directory):
        continue
    os.mkdir(directory)
!ls -shtl
total 556K
4.0K drwxr-xr-x 2 root root 4.0K Jun 13 06:00 backup
4.0K drwxr-xr-x 2 root root 4.0K Jun 13 06:00 cfg
4.0K drwxr-xr-x 2 root root 4.0K Jun 13 06:00 labels
4.0K drwxr-xr-x 2 root root 4.0K Jun 13 06:00 metadata
4.0K drwxr-xr-x 2 root root 4.0K Jun 13 06:00 images
4.0K drwxr-xr-x 6 root root 4.0K Jun 13 05:59 utils
4.0K drwxr-xr-x 2 root root 4.0K Jun 13 05:59 weights
388K -rw-r--r-- 1 root root 387K Jun 13 05:59 tutorial.ipynb
4.0K drwxr-xr-x 2 root root 4.0K Jun 13 05:59 models
4.0K -rwxr-xr-x 1 root root 677 Jun 13 05:59 requirements.txt
20K -rw-r--r-- 1 root root 17K Jun 13 05:59 test.py
36K -rw-r--r-- 1 root root 33K Jun 13 05:59 train.py
4.0K drwxr-xr-x 4 root root 4.0K Jun 13 05:59 data
12K -rw-r--r-- 1 root root 9.2K Jun 13 05:59 detect.py
8.0K -rw-r--r-- 1 root root 4.3K Jun 13 05:59 hubconf.py
36K -rw-r--r-- 1 root root 35K Jun 13 05:59 LICENSE
12K -rwxr-xr-x 1 root root 9.5K Jun 13 05:59 README.md
4.0K -rw-r--r-- 1 root root 1.8K Jun 13 05:59 Dockerfile
annots = pd.read_csv('/content/drive/MyDrive/Raw Data/stage_2_train_labels.csv')
annots.head()
|   | patientId | x | y | width | height | Target |
|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | NaN | NaN | NaN | NaN | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | NaN | NaN | NaN | NaN | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | NaN | NaN | NaN | NaN | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | NaN | NaN | NaN | NaN | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.0 | 152.0 | 213.0 | 379.0 | 1 |
Generate images and labels for training YOLOv3
import pydicom

def save_img_from_dcm(dcm_dir, img_dir, patient_id):
    img_fp = os.path.join(img_dir, "{}.jpg".format(patient_id))
    # skip images that have already been converted
    if os.path.exists(img_fp):
        return
    dcm_fp = os.path.join(dcm_dir, "{}.dcm".format(patient_id))
    # read the DICOM pixel data and replicate it into three channels
    img_1ch = pydicom.read_file(dcm_fp).pixel_array
    img_3ch = np.stack([img_1ch] * 3, -1)
    cv2.imwrite(img_fp, img_3ch)
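`np.stack([img_1ch]*3, -1)` simply replicates the single DICOM channel along a new last axis, producing the 3-channel image YOLOv3 expects. A quick shape check on a stand-in array:

```python
import numpy as np

# stand-in for pydicom's 1024x1024 grayscale pixel_array
img_1ch = np.zeros((1024, 1024), dtype=np.uint8)
img_3ch = np.stack([img_1ch] * 3, -1)
print(img_1ch.shape, img_3ch.shape)  # (1024, 1024) (1024, 1024, 3)
```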
def save_label_from_dcm(label_dir, patient_id, row=None):
    # RSNA default image size
    img_size = 1024
    label_fp = os.path.join(label_dir, "{}.txt".format(patient_id))
    f = open(label_fp, "a")
    if row is None:
        f.close()
        return
    top_left_x = row[1]
    top_left_y = row[2]
    w = row[3]
    h = row[4]
    # 'r' means relative, 'c' means center
    rx = top_left_x / img_size
    ry = top_left_y / img_size
    rw = w / img_size
    rh = h / img_size
    rcx = rx + rw / 2
    rcy = ry + rh / 2
    # YOLO label format: class x_center y_center width height (all relative)
    line = "{} {} {} {} {}\n".format(0, rcx, rcy, rw, rh)
    f.write(line)
    f.close()
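The normalization above can be checked against the one annotated row shown in the `annots` sample (x=264, y=152, width=213, height=379 on a 1024×1024 image):

```python
# convert the sample box from top-left pixel coordinates
# to YOLO's relative center format
img_size = 1024
top_left_x, top_left_y, w, h = 264.0, 152.0, 213.0, 379.0

rcx = (top_left_x + w / 2) / img_size  # relative center x
rcy = (top_left_y + h / 2) / img_size  # relative center y
rw, rh = w / img_size, h / img_size    # relative width and height

print(round(rcx, 4), round(rcy, 4), round(rw, 4), round(rh, 4))
# 0.3618 0.3335 0.208 0.3701
```

All four values fall in [0, 1], as YOLOv3 labels require.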
def save_yolov3_data_from_rsna(dcm_dir, img_dir, label_dir, annots):
    for row in tqdm(annots.values):
        patient_id = row[0]
        img_fp = os.path.join(img_dir, "{}.jpg".format(patient_id))
        # image already saved: just append the additional box to its label file
        if os.path.exists(img_fp):
            save_label_from_dcm(label_dir, patient_id, row)
            continue
        target = row[5]
        # the Colab kernel has a small disk (~8 GB), so images without a bounding box are skipped
        if target == 0:
            continue
        save_label_from_dcm(label_dir, patient_id, row)
        save_img_from_dcm(dcm_dir, img_dir, patient_id)

save_yolov3_data_from_rsna(train_dcm_dir, img_dir, label_dir, annots)
[03:14<2:25:36, 3.33it/s] 4%|▎ | 1092/30227 [03:15<2:33:46, 3.16it/s] 4%|▎ | 1098/30227 [03:16<2:00:27, 4.03it/s] 4%|▎ | 1105/30227 [03:16<1:38:05, 4.95it/s] 4%|▎ | 1107/30227 [03:17<1:47:01, 4.53it/s] 4%|▎ | 1108/30227 [03:17<2:39:43, 3.04it/s] 4%|▎ | 1109/30227 [03:18<3:33:07, 2.28it/s] 4%|▎ | 1112/30227 [03:19<3:04:34, 2.63it/s] 4%|▎ | 1114/30227 [03:19<3:02:02, 2.67it/s] 4%|▎ | 1118/30227 [03:20<2:31:26, 3.20it/s] 4%|▎ | 1120/30227 [03:21<2:37:29, 3.08it/s] 4%|▎ | 1123/30227 [03:22<2:22:31, 3.40it/s] 4%|▎ | 1126/30227 [03:22<2:12:29, 3.66it/s] 4%|▎ | 1131/30227 [03:23<1:53:21, 4.28it/s] 4%|▍ | 1134/30227 [03:23<1:45:38, 4.59it/s] 4%|▍ | 1138/30227 [03:24<1:38:30, 4.92it/s] 4%|▍ | 1143/30227 [03:25<1:30:56, 5.33it/s] 4%|▍ | 1144/30227 [03:26<2:53:17, 2.80it/s] 4%|▍ | 1145/30227 [03:26<3:38:48, 2.22it/s] 4%|▍ | 1149/30227 [03:27<2:56:42, 2.74it/s] 4%|▍ | 1151/30227 [03:28<2:49:55, 2.85it/s] 4%|▍ | 1153/30227 [03:28<2:39:44, 3.03it/s] 4%|▍ | 1158/30227 [03:29<2:13:02, 3.64it/s] 4%|▍ | 1161/30227 [03:30<2:04:11, 3.90it/s] 4%|▍ | 1167/30227 [03:30<1:42:28, 4.73it/s] 4%|▍ | 1170/30227 [03:31<1:45:02, 4.61it/s] 4%|▍ | 1172/30227 [03:32<2:04:28, 3.89it/s] 4%|▍ | 1175/30227 [03:32<2:02:28, 3.95it/s] 4%|▍ | 1177/30227 [03:34<2:58:30, 2.71it/s] 4%|▍ | 1178/30227 [03:34<3:42:33, 2.18it/s] 4%|▍ | 1179/30227 [03:35<4:08:55, 1.94it/s] 4%|▍ | 1182/30227 [03:36<3:25:22, 2.36it/s] 4%|▍ | 1184/30227 [03:36<3:13:56, 2.50it/s] 4%|▍ | 1187/30227 [03:37<2:41:29, 3.00it/s] 4%|▍ | 1189/30227 [03:37<2:43:29, 2.96it/s] 4%|▍ | 1191/30227 [03:38<2:37:19, 3.08it/s] 4%|▍ | 1193/30227 [03:39<2:28:52, 3.25it/s] 4%|▍ | 1195/30227 [03:39<2:31:35, 3.19it/s] 4%|▍ | 1197/30227 [03:40<2:36:12, 3.10it/s] 4%|▍ | 1200/30227 [03:41<2:21:56, 3.41it/s] 4%|▍ | 1202/30227 [03:41<2:30:59, 3.20it/s] 4%|▍ | 1203/30227 [03:42<3:17:13, 2.45it/s] 4%|▍ | 1208/30227 [03:43<2:38:47, 3.05it/s] 4%|▍ | 1211/30227 [03:43<2:19:02, 3.48it/s] 4%|▍ | 1218/30227 [03:44<1:48:56, 4.44it/s] 4%|▍ | 1223/30227 [03:44<1:34:58, 
5.09it/s] 4%|▍ | 1230/30227 [03:45<1:23:08, 5.81it/s] 4%|▍ | 1233/30227 [03:46<1:30:37, 5.33it/s] 4%|▍ | 1235/30227 [03:47<1:49:31, 4.41it/s] 4%|▍ | 1242/30227 [03:47<1:31:11, 5.30it/s] 4%|▍ | 1244/30227 [03:48<1:53:05, 4.27it/s] 4%|▍ | 1245/30227 [03:49<2:54:14, 2.77it/s] 4%|▍ | 1247/30227 [03:49<2:47:01, 2.89it/s] 4%|▍ | 1251/30227 [03:50<2:24:13, 3.35it/s] 4%|▍ | 1253/30227 [03:51<2:29:46, 3.22it/s] 4%|▍ | 1256/30227 [03:51<2:16:18, 3.54it/s] 4%|▍ | 1260/30227 [03:52<1:58:53, 4.06it/s] 4%|▍ | 1263/30227 [03:53<1:55:03, 4.20it/s] 4%|▍ | 1270/30227 [03:53<1:34:25, 5.11it/s] 4%|▍ | 1274/30227 [03:54<1:31:32, 5.27it/s] 4%|▍ | 1276/30227 [03:55<1:54:26, 4.22it/s] 4%|▍ | 1280/30227 [03:55<1:42:45, 4.70it/s] 4%|▍ | 1283/30227 [03:56<1:43:15, 4.67it/s] 4%|▍ | 1286/30227 [03:57<1:51:06, 4.34it/s] 4%|▍ | 1290/30227 [03:57<1:40:35, 4.79it/s] 4%|▍ | 1292/30227 [03:58<1:52:15, 4.30it/s] 4%|▍ | 1296/30227 [03:59<1:47:20, 4.49it/s] 4%|▍ | 1305/30227 [04:00<1:27:37, 5.50it/s] 4%|▍ | 1306/30227 [04:00<2:26:12, 3.30it/s] 4%|▍ | 1309/30227 [04:01<2:17:57, 3.49it/s] 4%|▍ | 1313/30227 [04:01<1:57:00, 4.12it/s] 4%|▍ | 1317/30227 [04:02<1:46:29, 4.52it/s] 4%|▍ | 1320/30227 [04:03<1:46:07, 4.54it/s] 4%|▍ | 1322/30227 [04:04<2:13:44, 3.60it/s] 4%|▍ | 1324/30227 [04:04<2:17:46, 3.50it/s] 4%|▍ | 1326/30227 [04:05<2:30:18, 3.20it/s] 4%|▍ | 1327/30227 [04:06<3:25:56, 2.34it/s] 4%|▍ | 1334/30227 [04:06<2:38:51, 3.03it/s] 4%|▍ | 1335/30227 [04:07<3:16:24, 2.45it/s] 4%|▍ | 1338/30227 [04:08<2:46:26, 2.89it/s] 4%|▍ | 1341/30227 [04:08<2:30:26, 3.20it/s] 4%|▍ | 1342/30227 [04:09<3:18:47, 2.42it/s] 4%|▍ | 1349/30227 [04:10<2:35:01, 3.10it/s] 4%|▍ | 1357/30227 [04:10<2:00:42, 3.99it/s] 4%|▍ | 1359/30227 [04:11<2:12:04, 3.64it/s] 5%|▍ | 1361/30227 [04:12<2:17:45, 3.49it/s] 5%|▍ | 1365/30227 [04:12<1:59:29, 4.03it/s] 5%|▍ | 1366/30227 [04:13<3:04:04, 2.61it/s] 5%|▍ | 1369/30227 [04:14<2:42:56, 2.95it/s] 5%|▍ | 1373/30227 [04:14<2:19:24, 3.45it/s] 5%|▍ | 1379/30227 [04:15<1:55:10, 4.17it/s] 5%|▍ | 
1380/30227 [04:16<2:50:50, 2.81it/s] 5%|▍ | 1395/30227 [04:16<2:05:52, 3.82it/s] 5%|▍ | 1399/30227 [04:17<1:50:54, 4.33it/s] 5%|▍ | 1401/30227 [04:18<2:02:53, 3.91it/s] 5%|▍ | 1403/30227 [04:18<2:11:21, 3.66it/s] 5%|▍ | 1405/30227 [04:19<2:23:49, 3.34it/s] 5%|▍ | 1407/30227 [04:20<2:28:35, 3.23it/s] 5%|▍ | 1411/30227 [04:20<2:08:18, 3.74it/s] 5%|▍ | 1412/30227 [04:21<3:22:38, 2.37it/s] 5%|▍ | 1419/30227 [04:22<2:37:58, 3.04it/s] 5%|▍ | 1426/30227 [04:23<2:05:46, 3.82it/s] 5%|▍ | 1436/30227 [04:23<1:38:57, 4.85it/s] 5%|▍ | 1440/30227 [04:24<1:38:28, 4.87it/s] 5%|▍ | 1443/30227 [04:25<1:43:48, 4.62it/s] 5%|▍ | 1444/30227 [04:26<2:46:09, 2.89it/s] 5%|▍ | 1448/30227 [04:26<2:19:36, 3.44it/s] 5%|▍ | 1453/30227 [04:27<1:56:32, 4.11it/s] 5%|▍ | 1455/30227 [04:28<2:08:12, 3.74it/s] 5%|▍ | 1459/30227 [04:28<1:52:52, 4.25it/s] 5%|▍ | 1463/30227 [04:29<1:37:37, 4.91it/s] 5%|▍ | 1465/30227 [04:29<2:03:20, 3.89it/s] 5%|▍ | 1467/30227 [04:30<2:15:10, 3.55it/s] 5%|▍ | 1471/30227 [04:31<1:59:53, 4.00it/s] 5%|▍ | 1481/30227 [04:32<1:33:20, 5.13it/s] 5%|▍ | 1482/30227 [04:32<2:36:52, 3.05it/s] 5%|▍ | 1487/30227 [04:33<2:06:11, 3.80it/s] 5%|▍ | 1489/30227 [04:33<2:20:53, 3.40it/s] 5%|▍ | 1492/30227 [04:34<2:10:25, 3.67it/s] 5%|▍ | 1502/30227 [04:35<1:38:22, 4.87it/s] 5%|▍ | 1505/30227 [04:35<1:42:17, 4.68it/s] 5%|▍ | 1508/30227 [04:36<1:41:39, 4.71it/s] 5%|▌ | 1513/30227 [04:37<1:31:41, 5.22it/s] 5%|▌ | 1514/30227 [04:37<2:36:23, 3.06it/s] 5%|▌ | 1515/30227 [04:38<3:37:02, 2.20it/s] 5%|▌ | 1516/30227 [04:39<4:27:11, 1.79it/s] 5%|▌ | 1527/30227 [04:39<3:15:24, 2.45it/s] 5%|▌ | 1537/30227 [04:40<2:26:33, 3.26it/s] 5%|▌ | 1542/30227 [04:41<2:04:01, 3.85it/s] 5%|▌ | 1545/30227 [04:42<2:03:09, 3.88it/s] 5%|▌ | 1547/30227 [04:42<2:15:13, 3.54it/s] 5%|▌ | 1551/30227 [04:43<1:57:34, 4.06it/s] 5%|▌ | 1555/30227 [04:44<1:45:43, 4.52it/s] 5%|▌ | 1557/30227 [04:44<2:00:22, 3.97it/s] 5%|▌ | 1561/30227 [04:45<1:49:15, 4.37it/s] 5%|▌ | 1563/30227 [04:46<2:05:51, 3.80it/s] 5%|▌ | 1567/30227 
[04:46<1:55:47, 4.13it/s] 5%|▌ | 1568/30227 [04:47<2:40:36, 2.97it/s] 5%|▌ | 1581/30227 [04:48<1:59:48, 3.98it/s] 5%|▌ | 1584/30227 [04:48<1:53:39, 4.20it/s] 5%|▌ | 1585/30227 [04:49<2:50:51, 2.79it/s] 5%|▌ | 1588/30227 [04:50<2:28:49, 3.21it/s] 5%|▌ | 1589/30227 [04:50<3:15:24, 2.44it/s] 5%|▌ | 1594/30227 [04:51<2:36:36, 3.05it/s] 5%|▌ | 1595/30227 [04:52<3:18:49, 2.40it/s] 5%|▌ | 1600/30227 [04:52<2:38:37, 3.01it/s] 5%|▌ | 1606/30227 [04:53<2:03:20, 3.87it/s] 5%|▌ | 1607/30227 [04:53<2:59:31, 2.66it/s] 5%|▌ | 1609/30227 [04:54<2:54:36, 2.73it/s] 5%|▌ | 1610/30227 [04:55<3:34:15, 2.23it/s] 5%|▌ | 1612/30227 [04:55<3:20:54, 2.37it/s] 5%|▌ | 1615/30227 [04:56<2:52:04, 2.77it/s] 5%|▌ | 1623/30227 [04:57<2:12:36, 3.59it/s] 5%|▌ | 1624/30227 [04:57<2:45:13, 2.89it/s] 5%|▌ | 1625/30227 [04:58<3:38:51, 2.18it/s] 5%|▌ | 1627/30227 [04:59<3:21:21, 2.37it/s] 5%|▌ | 1629/30227 [04:59<3:06:42, 2.55it/s] 5%|▌ | 1631/30227 [05:00<3:07:10, 2.55it/s] 5%|▌ | 1633/30227 [05:01<3:00:55, 2.63it/s] 5%|▌ | 1642/30227 [05:01<2:17:11, 3.47it/s] 5%|▌ | 1646/30227 [05:02<1:59:47, 3.98it/s] 5%|▌ | 1650/30227 [05:03<1:47:37, 4.43it/s] 5%|▌ | 1652/30227 [05:03<2:01:47, 3.91it/s] 5%|▌ | 1655/30227 [05:04<1:58:40, 4.01it/s] 5%|▌ | 1662/30227 [05:05<1:36:57, 4.91it/s] 6%|▌ | 1666/30227 [05:05<1:30:49, 5.24it/s] 6%|▌ | 1668/30227 [05:06<1:49:16, 4.36it/s] 6%|▌ | 1670/30227 [05:07<2:11:45, 3.61it/s] 6%|▌ | 1671/30227 [05:08<3:04:39, 2.58it/s] 6%|▌ | 1677/30227 [05:08<2:22:31, 3.34it/s] 6%|▌ | 1679/30227 [05:09<2:21:06, 3.37it/s] 6%|▌ | 1681/30227 [05:09<2:24:29, 3.29it/s] 6%|▌ | 1685/30227 [05:10<2:00:38, 3.94it/s] 6%|▌ | 1688/30227 [05:11<1:57:38, 4.04it/s] 6%|▌ | 1696/30227 [05:11<1:34:34, 5.03it/s] 6%|▌ | 1704/30227 [05:12<1:17:43, 6.12it/s] 6%|▌ | 1705/30227 [05:13<2:47:24, 2.84it/s] 6%|▌ | 1706/30227 [05:13<3:30:22, 2.26it/s] 6%|▌ | 1707/30227 [05:14<3:47:54, 2.09it/s] 6%|▌ | 1711/30227 [05:14<2:57:44, 2.67it/s] 6%|▌ | 1714/30227 [05:15<2:35:00, 3.07it/s] 6%|▌ | 1723/30227 [05:16<1:59:34, 
3.97it/s] 6%|▌ | 1729/30227 [05:16<1:39:43, 4.76it/s] 6%|▌ | 1734/30227 [05:17<1:29:36, 5.30it/s] 6%|▌ | 1740/30227 [05:18<1:18:24, 6.06it/s] 6%|▌ | 1751/30227 [05:18<1:03:38, 7.46it/s] 6%|▌ | 1753/30227 [05:19<1:34:20, 5.03it/s] 6%|▌ | 1755/30227 [05:20<1:44:53, 4.52it/s] 6%|▌ | 1757/30227 [05:20<1:58:59, 3.99it/s] 6%|▌ | 1758/30227 [05:21<3:10:43, 2.49it/s] 6%|▌ | 1759/30227 [05:22<3:58:52, 1.99it/s] 6%|▌ | 1761/30227 [05:23<3:35:17, 2.20it/s] 6%|▌ | 1763/30227 [05:23<3:18:04, 2.40it/s] 6%|▌ | 1767/30227 [05:24<2:41:46, 2.93it/s] 6%|▌ | 1783/30227 [05:25<2:00:08, 3.95it/s] 6%|▌ | 1788/30227 [05:25<1:43:54, 4.56it/s] 6%|▌ | 1789/30227 [05:26<2:54:15, 2.72it/s] 6%|▌ | 1791/30227 [05:27<2:53:12, 2.74it/s] 6%|▌ | 1792/30227 [05:27<3:41:24, 2.14it/s] 6%|▌ | 1798/30227 [05:28<2:48:38, 2.81it/s] 6%|▌ | 1801/30227 [05:29<2:39:03, 2.98it/s] 6%|▌ | 1802/30227 [05:30<3:28:04, 2.28it/s] 6%|▌ | 1805/30227 [05:30<2:58:49, 2.65it/s] 6%|▌ | 1806/30227 [05:31<3:38:15, 2.17it/s] 6%|▌ | 1810/30227 [05:32<2:57:37, 2.67it/s] 6%|▌ | 1825/30227 [05:32<2:10:52, 3.62it/s] 6%|▌ | 1826/30227 [05:33<3:02:49, 2.59it/s] 6%|▌ | 1830/30227 [05:34<2:32:09, 3.11it/s] 6%|▌ | 1832/30227 [05:34<2:33:02, 3.09it/s] 6%|▌ | 1833/30227 [05:35<3:30:50, 2.24it/s] 6%|▌ | 1835/30227 [05:36<3:14:54, 2.43it/s] 6%|▌ | 1838/30227 [05:36<2:47:10, 2.83it/s] 6%|▌ | 1839/30227 [05:37<3:34:23, 2.21it/s] 6%|▌ | 1842/30227 [05:38<3:03:48, 2.57it/s] 6%|▌ | 1844/30227 [05:39<3:07:07, 2.53it/s] 6%|▌ | 1846/30227 [05:39<3:05:38, 2.55it/s] 6%|▌ | 1850/30227 [05:40<2:37:27, 3.00it/s] 6%|▌ | 1854/30227 [05:41<2:18:27, 3.42it/s] 6%|▌ | 1859/30227 [05:42<1:56:42, 4.05it/s] 6%|▌ | 1865/30227 [05:42<1:37:44, 4.84it/s] 6%|▌ | 1868/30227 [05:43<1:35:13, 4.96it/s] 6%|▌ | 1872/30227 [05:44<1:31:00, 5.19it/s] 6%|▌ | 1873/30227 [05:44<2:38:02, 2.99it/s] 6%|▌ | 1886/30227 [05:45<1:57:52, 4.01it/s] 6%|▌ | 1889/30227 [05:46<1:53:28, 4.16it/s] 6%|▋ | 1892/30227 [05:46<1:47:49, 4.38it/s] 6%|▋ | 1895/30227 [05:47<1:44:13, 4.53it/s] 6%|▋ | 
1897/30227 [05:47<1:54:46, 4.11it/s] 6%|▋ | 1900/30227 [05:48<1:52:43, 4.19it/s] 6%|▋ | 1904/30227 [05:49<1:44:22, 4.52it/s] 6%|▋ | 1907/30227 [05:49<1:46:03, 4.45it/s] 6%|▋ | 1912/30227 [05:50<1:32:07, 5.12it/s] 6%|▋ | 1918/30227 [05:51<1:20:36, 5.85it/s] 6%|▋ | 1919/30227 [05:51<2:33:37, 3.07it/s] 6%|▋ | 1923/30227 [05:52<2:11:56, 3.58it/s] 6%|▋ | 1926/30227 [05:53<1:59:25, 3.95it/s] 6%|▋ | 1929/30227 [05:53<1:57:14, 4.02it/s] 6%|▋ | 1932/30227 [05:54<1:53:21, 4.16it/s] 6%|▋ | 1933/30227 [05:55<2:48:40, 2.80it/s] 6%|▋ | 1936/30227 [05:55<2:24:36, 3.26it/s] 6%|▋ | 1939/30227 [05:56<2:13:50, 3.52it/s] 6%|▋ | 1942/30227 [05:57<2:04:35, 3.78it/s] 6%|▋ | 1944/30227 [05:57<2:14:45, 3.50it/s] 6%|▋ | 1947/30227 [05:58<2:04:52, 3.77it/s] 6%|▋ | 1953/30227 [05:59<1:42:58, 4.58it/s] 6%|▋ | 1956/30227 [05:59<1:49:01, 4.32it/s] 6%|▋ | 1958/30227 [06:00<2:05:52, 3.74it/s] 6%|▋ | 1960/30227 [06:01<2:22:37, 3.30it/s] 6%|▋ | 1963/30227 [06:02<2:10:34, 3.61it/s] 7%|▋ | 1967/30227 [06:02<1:57:51, 4.00it/s] 7%|▋ | 1972/30227 [06:03<1:40:47, 4.67it/s] 7%|▋ | 1974/30227 [06:04<1:57:17, 4.01it/s] 7%|▋ | 1975/30227 [06:04<2:53:08, 2.72it/s] 7%|▋ | 1976/30227 [06:05<3:39:28, 2.15it/s] 7%|▋ | 1979/30227 [06:06<3:00:14, 2.61it/s] 7%|▋ | 1980/30227 [06:06<3:37:46, 2.16it/s] 7%|▋ | 1982/30227 [06:07<3:09:08, 2.49it/s] 7%|▋ | 1990/30227 [06:07<2:24:03, 3.27it/s] 7%|▋ | 1994/30227 [06:08<2:06:23, 3.72it/s] 7%|▋ | 2000/30227 [06:09<1:53:05, 4.16it/s] 7%|▋ | 2002/30227 [06:10<2:06:31, 3.72it/s] 7%|▋ | 2004/30227 [06:10<2:17:29, 3.42it/s] 7%|▋ | 2007/30227 [06:11<2:14:32, 3.50it/s] 7%|▋ | 2013/30227 [06:12<1:51:04, 4.23it/s] 7%|▋ | 2017/30227 [06:13<1:44:45, 4.49it/s] 7%|▋ | 2020/30227 [06:13<1:43:53, 4.52it/s] 7%|▋ | 2022/30227 [06:14<1:56:14, 4.04it/s] 7%|▋ | 2024/30227 [06:15<2:14:00, 3.51it/s] 7%|▋ | 2033/30227 [06:16<1:45:50, 4.44it/s] 7%|▋ | 2034/30227 [06:16<2:46:50, 2.82it/s] 7%|▋ | 2037/30227 [06:17<2:27:44, 3.18it/s] 7%|▋ | 2041/30227 [06:18<2:07:55, 3.67it/s] 7%|▋ | 2046/30227 
[06:18<1:48:49, 4.32it/s] 7%|▋ | 2049/30227 [06:19<1:52:10, 4.19it/s] 7%|▋ | 2052/30227 [06:20<1:49:46, 4.28it/s] 7%|▋ | 2054/30227 [06:20<2:03:26, 3.80it/s] 7%|▋ | 2055/30227 [06:21<2:53:12, 2.71it/s] 7%|▋ | 2056/30227 [06:22<3:48:30, 2.05it/s] 7%|▋ | 2063/30227 [06:22<2:53:55, 2.70it/s] 7%|▋ | 2065/30227 [06:23<2:53:29, 2.71it/s] 7%|▋ | 2068/30227 [06:24<2:33:24, 3.06it/s] 7%|▋ | 2070/30227 [06:25<2:35:35, 3.02it/s] 7%|▋ | 2071/30227 [06:25<3:31:22, 2.22it/s] 7%|▋ | 2082/30227 [06:26<2:37:51, 2.97it/s] 7%|▋ | 2083/30227 [06:27<3:27:18, 2.26it/s] 7%|▋ | 2093/30227 [06:27<2:34:21, 3.04it/s] 7%|▋ | 2098/30227 [06:28<2:07:04, 3.69it/s] 7%|▋ | 2102/30227 [06:29<1:51:23, 4.21it/s] 7%|▋ | 2103/30227 [06:29<2:48:10, 2.79it/s] 7%|▋ | 2107/30227 [06:30<2:19:38, 3.36it/s] 7%|▋ | 2112/30227 [06:31<1:56:34, 4.02it/s] 7%|▋ | 2115/30227 [06:31<1:51:30, 4.20it/s] 7%|▋ | 2119/30227 [06:32<1:40:56, 4.64it/s] 7%|▋ | 2124/30227 [06:33<1:28:40, 5.28it/s] 7%|▋ | 2129/30227 [06:33<1:21:12, 5.77it/s] 7%|▋ | 2150/30227 [06:34<1:00:42, 7.71it/s] 7%|▋ | 2159/30227 [06:34<52:12, 8.96it/s] 7%|▋ | 2162/30227 [06:35<1:06:53, 6.99it/s] 7%|▋ | 2164/30227 [06:36<1:35:02, 4.92it/s] 7%|▋ | 2165/30227 [06:36<2:40:51, 2.91it/s] 7%|▋ | 2166/30227 [06:37<3:43:21, 2.09it/s] 7%|▋ | 2169/30227 [06:38<3:08:22, 2.48it/s] 7%|▋ | 2170/30227 [06:39<3:48:44, 2.04it/s] 7%|▋ | 2172/30227 [06:39<3:26:34, 2.26it/s] 7%|▋ | 2179/30227 [06:40<2:35:27, 3.01it/s] 7%|▋ | 2185/30227 [06:40<2:04:24, 3.76it/s] 7%|▋ | 2186/30227 [06:41<3:01:49, 2.57it/s] 7%|▋ | 2190/30227 [06:42<2:30:37, 3.10it/s] 7%|▋ | 2191/30227 [06:43<3:26:10, 2.27it/s] 7%|▋ | 2198/30227 [06:43<2:38:15, 2.95it/s] 7%|▋ | 2201/30227 [06:44<2:20:59, 3.31it/s] 7%|▋ | 2203/30227 [06:45<2:25:57, 3.20it/s] 7%|▋ | 2209/30227 [06:45<1:55:30, 4.04it/s] 7%|▋ | 2211/30227 [06:46<2:13:10, 3.51it/s] 7%|▋ | 2214/30227 [06:47<2:04:08, 3.76it/s] 7%|▋ | 2220/30227 [06:47<1:43:23, 4.51it/s] 7%|▋ | 2222/30227 [06:48<2:00:30, 3.87it/s] 7%|▋ | 2224/30227 [06:49<2:09:40, 
3.60it/s] 7%|▋ | 2225/30227 [06:49<3:08:26, 2.48it/s] 7%|▋ | 2226/30227 [06:50<3:35:33, 2.16it/s] 7%|▋ | 2227/30227 [06:51<4:06:33, 1.89it/s] 7%|▋ | 2235/30227 [06:51<3:04:41, 2.53it/s] 7%|▋ | 2238/30227 [06:52<2:44:59, 2.83it/s] 7%|▋ | 2241/30227 [06:53<2:20:07, 3.33it/s] 7%|▋ | 2242/30227 [06:53<3:14:24, 2.40it/s] 7%|▋ | 2246/30227 [06:54<2:38:19, 2.95it/s] 7%|▋ | 2248/30227 [06:55<2:43:19, 2.86it/s] 7%|▋ | 2251/30227 [06:55<2:27:39, 3.16it/s] 7%|▋ | 2252/30227 [06:56<3:17:06, 2.37it/s] 7%|▋ | 2256/30227 [06:57<2:41:29, 2.89it/s] 7%|▋ | 2262/30227 [06:58<2:11:43, 3.54it/s] 8%|▊ | 2275/30227 [06:58<1:38:17, 4.74it/s] 8%|▊ | 2277/30227 [06:59<1:54:45, 4.06it/s] 8%|▊ | 2280/30227 [06:59<1:45:40, 4.41it/s] 8%|▊ | 2284/30227 [07:00<1:39:08, 4.70it/s] 8%|▊ | 2286/30227 [07:01<1:55:45, 4.02it/s] 8%|▊ | 2289/30227 [07:01<1:58:35, 3.93it/s] 8%|▊ | 2294/30227 [07:02<1:44:32, 4.45it/s] 8%|▊ | 2296/30227 [07:03<2:04:14, 3.75it/s] 8%|▊ | 2300/30227 [07:04<1:49:50, 4.24it/s] 8%|▊ | 2302/30227 [07:04<2:10:42, 3.56it/s] 8%|▊ | 2303/30227 [07:05<2:52:28, 2.70it/s] 8%|▊ | 2309/30227 [07:06<2:16:50, 3.40it/s] 8%|▊ | 2312/30227 [07:06<2:06:05, 3.69it/s] 8%|▊ | 2316/30227 [07:07<1:52:50, 4.12it/s] 8%|▊ | 2320/30227 [07:08<1:43:18, 4.50it/s] 8%|▊ | 2324/30227 [07:08<1:38:05, 4.74it/s] 8%|▊ | 2328/30227 [07:09<1:28:57, 5.23it/s] 8%|▊ | 2330/30227 [07:10<1:47:16, 4.33it/s] 8%|▊ | 2334/30227 [07:10<1:38:55, 4.70it/s] 8%|▊ | 2336/30227 [07:11<1:48:09, 4.30it/s] 8%|▊ | 2340/30227 [07:12<1:38:48, 4.70it/s] 8%|▊ | 2342/30227 [07:12<1:56:03, 4.00it/s] 8%|▊ | 2347/30227 [07:13<1:39:24, 4.67it/s] 8%|▊ | 2349/30227 [07:14<1:50:02, 4.22it/s] 8%|▊ | 2352/30227 [07:14<1:44:34, 4.44it/s] 8%|▊ | 2356/30227 [07:15<1:35:38, 4.86it/s] 8%|▊ | 2358/30227 [07:15<1:57:39, 3.95it/s] 8%|▊ | 2359/30227 [07:16<3:05:55, 2.50it/s] 8%|▊ | 2361/30227 [07:17<2:50:25, 2.73it/s] 8%|▊ | 2362/30227 [07:18<3:41:27, 2.10it/s] 8%|▊ | 2366/30227 [07:18<2:53:29, 2.68it/s] 8%|▊ | 2367/30227 [07:19<3:15:18, 2.38it/s] 8%|▊ | 
2371/30227 [07:19<2:39:02, 2.92it/s] 8%|▊ | 2373/30227 [07:20<2:36:20, 2.97it/s] 8%|▊ | 2376/30227 [07:21<2:21:52, 3.27it/s] 8%|▊ | 2383/30227 [07:21<1:52:29, 4.13it/s] 8%|▊ | 2387/30227 [07:22<1:59:57, 3.87it/s] 8%|▊ | 2389/30227 [07:23<2:12:18, 3.51it/s] 8%|▊ | 2390/30227 [07:24<3:02:49, 2.54it/s] 8%|▊ | 2391/30227 [07:24<3:38:28, 2.12it/s] 8%|▊ | 2393/30227 [07:25<3:21:54, 2.30it/s] 8%|▊ | 2395/30227 [07:26<3:13:38, 2.40it/s] 8%|▊ | 2397/30227 [07:27<3:01:18, 2.56it/s] 8%|▊ | 2402/30227 [07:27<2:26:33, 3.16it/s] 8%|▊ | 2405/30227 [07:28<2:15:43, 3.42it/s] 8%|▊ | 2407/30227 [07:29<2:19:37, 3.32it/s] 8%|▊ | 2409/30227 [07:29<2:27:07, 3.15it/s] 8%|▊ | 2412/30227 [07:30<2:12:30, 3.50it/s] 8%|▊ | 2413/30227 [07:31<3:22:43, 2.29it/s] 8%|▊ | 2415/30227 [07:31<3:03:52, 2.52it/s] 8%|▊ | 2421/30227 [07:32<2:23:39, 3.23it/s] 8%|▊ | 2424/30227 [07:33<2:12:17, 3.50it/s] 8%|▊ | 2428/30227 [07:33<1:55:17, 4.02it/s] 8%|▊ | 2435/30227 [07:34<1:34:39, 4.89it/s] 8%|▊ | 2437/30227 [07:35<1:53:53, 4.07it/s] 8%|▊ | 2440/30227 [07:35<1:54:14, 4.05it/s] 8%|▊ | 2443/30227 [07:36<1:45:35, 4.39it/s] 8%|▊ | 2446/30227 [07:37<1:47:22, 4.31it/s] 8%|▊ | 2448/30227 [07:37<2:05:06, 3.70it/s] 8%|▊ | 2451/30227 [07:38<1:54:02, 4.06it/s] 8%|▊ | 2453/30227 [07:39<2:08:46, 3.59it/s] 8%|▊ | 2454/30227 [07:39<3:07:15, 2.47it/s] 8%|▊ | 2456/30227 [07:40<2:59:33, 2.58it/s] 8%|▊ | 2458/30227 [07:41<2:59:40, 2.58it/s] 8%|▊ | 2461/30227 [07:42<2:35:59, 2.97it/s] 8%|▊ | 2463/30227 [07:42<2:33:41, 3.01it/s] 8%|▊ | 2466/30227 [07:43<2:13:45, 3.46it/s] 8%|▊ | 2470/30227 [07:43<1:57:01, 3.95it/s] 8%|▊ | 2472/30227 [07:44<2:17:57, 3.35it/s] 8%|▊ | 2474/30227 [07:45<2:13:18, 3.47it/s] 8%|▊ | 2475/30227 [07:45<3:03:31, 2.52it/s] 8%|▊ | 2477/30227 [07:46<2:46:15, 2.78it/s] 8%|▊ | 2488/30227 [07:47<2:04:44, 3.71it/s] 8%|▊ | 2494/30227 [07:47<1:43:38, 4.46it/s] 8%|▊ | 2496/30227 [07:48<1:59:04, 3.88it/s] 8%|▊ | 2500/30227 [07:49<1:46:47, 4.33it/s] 8%|▊ | 2502/30227 [07:49<2:01:20, 3.81it/s] 8%|▊ | 2508/30227 
[07:50<1:39:35, 4.64it/s] 8%|▊ | 2513/30227 [07:51<1:29:13, 5.18it/s] 8%|▊ | 2516/30227 [07:51<1:32:49, 4.98it/s] 8%|▊ | 2518/30227 [07:52<1:50:38, 4.17it/s] 8%|▊ | 2526/30227 [07:53<1:30:09, 5.12it/s] 8%|▊ | 2528/30227 [07:53<1:52:42, 4.10it/s] 8%|▊ | 2531/30227 [07:54<1:57:25, 3.93it/s] 8%|▊ | 2532/30227 [07:55<2:53:15, 2.66it/s] 8%|▊ | 2533/30227 [07:56<3:34:15, 2.15it/s] 8%|▊ | 2536/30227 [07:56<2:57:56, 2.59it/s] 8%|▊ | 2539/30227 [07:57<2:34:27, 2.99it/s] 8%|▊ | 2544/30227 [07:58<2:06:26, 3.65it/s] 8%|▊ | 2546/30227 [07:58<2:05:05, 3.69it/s] 8%|▊ | 2552/30227 [07:59<1:43:11, 4.47it/s] 8%|▊ | 2557/30227 [08:00<1:34:04, 4.90it/s] 8%|▊ | 2560/30227 [08:00<1:38:16, 4.69it/s] 8%|▊ | 2566/30227 [08:01<1:23:53, 5.50it/s] 8%|▊ | 2568/30227 [08:02<1:44:03, 4.43it/s] 9%|▊ | 2570/30227 [08:02<1:58:33, 3.89it/s] 9%|▊ | 2571/30227 [08:03<2:51:52, 2.68it/s] 9%|▊ | 2574/30227 [08:04<2:32:08, 3.03it/s] 9%|▊ | 2576/30227 [08:04<2:36:13, 2.95it/s] 9%|▊ | 2578/30227 [08:05<2:36:28, 2.95it/s] 9%|▊ | 2584/30227 [08:06<2:02:32, 3.76it/s] 9%|▊ | 2585/30227 [08:06<2:43:44, 2.81it/s] 9%|▊ | 2586/30227 [08:07<3:25:12, 2.24it/s] 9%|▊ | 2589/30227 [08:07<2:54:29, 2.64it/s] 9%|▊ | 2592/30227 [08:08<2:33:58, 2.99it/s] 9%|▊ | 2595/30227 [08:09<2:17:47, 3.34it/s] 9%|▊ | 2596/30227 [08:10<3:24:36, 2.25it/s] 9%|▊ | 2600/30227 [08:10<2:47:08, 2.75it/s] 9%|▊ | 2606/30227 [08:11<2:11:52, 3.49it/s] 9%|▊ | 2608/30227 [08:11<2:10:06, 3.54it/s] 9%|▊ | 2611/30227 [08:12<2:01:19, 3.79it/s] 9%|▊ | 2612/30227 [08:13<2:57:22, 2.59it/s] 9%|▊ | 2614/30227 [08:13<2:52:17, 2.67it/s] 9%|▊ | 2618/30227 [08:14<2:27:14, 3.13it/s] 9%|▊ | 2620/30227 [08:15<2:30:53, 3.05it/s] 9%|▊ | 2630/30227 [08:16<1:55:18, 3.99it/s] 9%|▊ | 2635/30227 [08:16<1:36:30, 4.77it/s] 9%|▊ | 2637/30227 [08:17<1:58:27, 3.88it/s] 9%|▊ | 2639/30227 [08:18<2:06:39, 3.63it/s] 9%|▊ | 2643/30227 [08:18<1:53:46, 4.04it/s] 9%|▉ | 2648/30227 [08:19<1:37:10, 4.73it/s] 9%|▉ | 2652/30227 [08:20<1:34:46, 4.85it/s] 9%|▉ | 2655/30227 [08:20<1:39:32, 
4.62it/s] 9%|▉ | 2662/30227 [08:21<1:23:59, 5.47it/s] 9%|▉ | 2664/30227 [08:22<1:36:25, 4.76it/s] 9%|▉ | 2667/30227 [08:22<1:39:13, 4.63it/s] 9%|▉ | 2670/30227 [08:23<1:42:08, 4.50it/s] 9%|▉ | 2673/30227 [08:24<1:38:26, 4.67it/s] 9%|▉ | 2676/30227 [08:24<1:39:03, 4.64it/s] 9%|▉ | 2679/30227 [08:25<1:39:47, 4.60it/s] 9%|▉ | 2681/30227 [08:26<1:58:45, 3.87it/s] 9%|▉ | 2684/30227 [08:26<1:50:46, 4.14it/s] 9%|▉ | 2687/30227 [08:27<1:48:45, 4.22it/s] 9%|▉ | 2691/30227 [08:28<1:41:01, 4.54it/s] 9%|▉ | 2694/30227 [08:28<1:42:06, 4.49it/s] 9%|▉ | 2697/30227 [08:29<1:44:51, 4.38it/s] 9%|▉ | 2698/30227 [08:30<2:31:02, 3.04it/s] 9%|▉ | 2701/30227 [08:30<2:19:39, 3.28it/s] 9%|▉ | 2706/30227 [08:31<1:59:23, 3.84it/s] 9%|▉ | 2707/30227 [08:32<3:11:12, 2.40it/s] 9%|▉ | 2715/30227 [08:33<2:23:47, 3.19it/s] 9%|▉ | 2717/30227 [08:33<2:34:33, 2.97it/s] 9%|▉ | 2722/30227 [08:34<2:10:18, 3.52it/s] 9%|▉ | 2725/30227 [08:35<2:04:41, 3.68it/s] 9%|▉ | 2727/30227 [08:36<2:13:07, 3.44it/s] 9%|▉ | 2733/30227 [08:36<1:47:34, 4.26it/s] 9%|▉ | 2734/30227 [08:37<2:45:17, 2.77it/s] 9%|▉ | 2736/30227 [08:37<2:31:50, 3.02it/s] 9%|▉ | 2741/30227 [08:38<2:05:42, 3.64it/s] 9%|▉ | 2745/30227 [08:39<1:51:41, 4.10it/s] 9%|▉ | 2747/30227 [08:39<2:02:45, 3.73it/s] 9%|▉ | 2749/30227 [08:40<2:16:22, 3.36it/s] 9%|▉ | 2752/30227 [08:41<2:08:36, 3.56it/s] 9%|▉ | 2758/30227 [08:42<1:47:01, 4.28it/s] 9%|▉ | 2763/30227 [08:42<1:36:43, 4.73it/s] 9%|▉ | 2766/30227 [08:43<1:39:25, 4.60it/s] 9%|▉ | 2769/30227 [08:44<1:39:15, 4.61it/s] 9%|▉ | 2772/30227 [08:44<1:40:51, 4.54it/s] 9%|▉ | 2777/30227 [08:45<1:27:55, 5.20it/s] 9%|▉ | 2779/30227 [08:46<1:42:48, 4.45it/s] 9%|▉ | 2780/30227 [08:46<2:54:12, 2.63it/s] 9%|▉ | 2783/30227 [08:47<2:33:39, 2.98it/s] 9%|▉ | 2785/30227 [08:48<2:25:28, 3.14it/s] 9%|▉ | 2789/30227 [08:48<2:06:17, 3.62it/s] 9%|▉ | 2790/30227 [08:49<3:04:20, 2.48it/s] 9%|▉ | 2791/30227 [08:50<3:41:05, 2.07it/s] 9%|▉ | 2796/30227 [08:51<2:56:32, 2.59it/s] 9%|▉ | 2806/30227 [08:51<2:14:12, 3.41it/s] 9%|▉ | 
2810/30227 [08:52<1:56:08, 3.93it/s] 9%|▉ | 2811/30227 [08:53<2:59:39, 2.54it/s] 9%|▉ | 2813/30227 [08:54<3:02:48, 2.50it/s] 9%|▉ | 2816/30227 [08:54<2:39:40, 2.86it/s] 9%|▉ | 2818/30227 [08:55<2:36:43, 2.91it/s] 9%|▉ | 2824/30227 [08:55<2:03:17, 3.70it/s] 9%|▉ | 2826/30227 [08:56<2:10:29, 3.50it/s] 9%|▉ | 2828/30227 [08:57<2:17:56, 3.31it/s] 9%|▉ | 2832/30227 [08:58<2:00:47, 3.78it/s] 9%|▉ | 2836/30227 [08:58<1:44:06, 4.38it/s] 9%|▉ | 2838/30227 [08:59<1:53:46, 4.01it/s] 9%|▉ | 2839/30227 [08:59<2:53:00, 2.64it/s] 9%|▉ | 2840/30227 [09:00<3:30:47, 2.17it/s] 9%|▉ | 2841/30227 [09:01<4:11:19, 1.82it/s] 9%|▉ | 2842/30227 [09:01<4:13:10, 1.80it/s] 9%|▉ | 2845/30227 [09:02<3:23:09, 2.25it/s] 9%|▉ | 2847/30227 [09:03<3:13:25, 2.36it/s] 9%|▉ | 2850/30227 [09:03<2:44:22, 2.78it/s] 9%|▉ | 2853/30227 [09:04<2:31:12, 3.02it/s] 9%|▉ | 2855/30227 [09:05<2:39:16, 2.86it/s] 9%|▉ | 2860/30227 [09:06<2:11:47, 3.46it/s] 9%|▉ | 2861/30227 [09:06<3:08:33, 2.42it/s] 9%|▉ | 2865/30227 [09:07<2:37:18, 2.90it/s] 9%|▉ | 2866/30227 [09:08<3:31:11, 2.16it/s] 9%|▉ | 2868/30227 [09:08<3:15:12, 2.34it/s] 9%|▉ | 2871/30227 [09:09<2:48:49, 2.70it/s] 10%|▉ | 2875/30227 [09:10<2:21:16, 3.23it/s] 10%|▉ | 2879/30227 [09:10<1:59:00, 3.83it/s] 10%|▉ | 2881/30227 [09:11<2:07:20, 3.58it/s] 10%|▉ | 2883/30227 [09:12<2:17:11, 3.32it/s] 10%|▉ | 2886/30227 [09:13<2:07:48, 3.57it/s] 10%|▉ | 2891/30227 [09:13<1:48:12, 4.21it/s] 10%|▉ | 2893/30227 [09:14<2:00:16, 3.79it/s] 10%|▉ | 2897/30227 [09:15<1:51:33, 4.08it/s] 10%|▉ | 2899/30227 [09:15<2:02:53, 3.71it/s] 10%|▉ | 2902/30227 [09:16<2:03:49, 3.68it/s] 10%|▉ | 2906/30227 [09:17<1:45:20, 4.32it/s] 10%|▉ | 2908/30227 [09:17<2:01:55, 3.73it/s] 10%|▉ | 2911/30227 [09:18<1:56:10, 3.92it/s] 10%|▉ | 2915/30227 [09:19<1:46:38, 4.27it/s] 10%|▉ | 2920/30227 [09:20<1:34:02, 4.84it/s] 10%|▉ | 2921/30227 [09:20<2:35:49, 2.92it/s] 10%|▉ | 2923/30227 [09:21<2:32:57, 2.97it/s] 10%|▉ | 2926/30227 [09:21<2:16:46, 3.33it/s] 10%|▉ | 2928/30227 [09:22<2:24:52, 3.14it/s] 10%|▉ | 
2929/30227 [09:23<3:11:26, 2.38it/s] 10%|▉ | 2931/30227 [09:24<3:02:23, 2.49it/s] 10%|▉ | 2937/30227 [09:24<2:22:20, 3.20it/s] 10%|▉ | 2939/30227 [09:25<2:27:39, 3.08it/s] 10%|▉ | 2941/30227 [09:26<2:30:23, 3.02it/s] 10%|▉ | 2944/30227 [09:26<2:17:24, 3.31it/s] 10%|▉ | 2948/30227 [09:27<1:58:32, 3.84it/s] 10%|▉ | 2951/30227 [09:28<1:53:05, 4.02it/s] 10%|▉ | 2954/30227 [09:28<1:51:42, 4.07it/s] 10%|▉ | 2960/30227 [09:29<1:36:24, 4.71it/s] 10%|▉ | 2963/30227 [09:30<1:42:01, 4.45it/s] 10%|▉ | 2967/30227 [09:31<1:37:04, 4.68it/s] 10%|▉ | 2972/30227 [09:31<1:25:13, 5.33it/s] 10%|▉ | 2983/30227 [09:32<1:08:18, 6.65it/s] 10%|▉ | 2989/30227 [09:33<1:03:01, 7.20it/s] 10%|▉ | 2990/30227 [09:33<2:11:11, 3.46it/s] 10%|▉ | 2991/30227 [09:34<3:09:41, 2.39it/s] 10%|▉ | 2995/30227 [09:35<2:33:17, 2.96it/s] 10%|▉ | 2999/30227 [09:35<2:07:00, 3.57it/s] 10%|▉ | 3001/30227 [09:36<2:15:52, 3.34it/s] 10%|▉ | 3003/30227 [09:37<2:21:18, 3.21it/s] 10%|▉ | 3005/30227 [09:37<2:26:18, 3.10it/s] 10%|▉ | 3006/30227 [09:38<3:13:01, 2.35it/s] 10%|▉ | 3008/30227 [09:39<3:00:12, 2.52it/s] 10%|▉ | 3010/30227 [09:39<2:53:39, 2.61it/s] 10%|▉ | 3013/30227 [09:40<2:31:54, 2.99it/s] 10%|▉ | 3015/30227 [09:41<2:35:15, 2.92it/s] 10%|▉ | 3017/30227 [09:41<2:40:43, 2.82it/s] 10%|▉ | 3019/30227 [09:42<2:31:08, 3.00it/s] 10%|▉ | 3021/30227 [09:43<2:29:25, 3.03it/s] 10%|▉ | 3022/30227 [09:43<3:16:48, 2.30it/s] 10%|█ | 3026/30227 [09:44<2:40:48, 2.82it/s] 10%|█ | 3028/30227 [09:45<2:38:37, 2.86it/s] 10%|█ | 3031/30227 [09:45<2:20:32, 3.23it/s] 10%|█ | 3033/30227 [09:46<2:26:53, 3.09it/s] 10%|█ | 3036/30227 [09:47<2:13:57, 3.38it/s] 10%|█ | 3037/30227 [09:47<3:10:01, 2.38it/s] 10%|█ | 3041/30227 [09:48<2:37:03, 2.88it/s] 10%|█ | 3047/30227 [09:49<2:07:21, 3.56it/s] 10%|█ | 3050/30227 [09:50<2:03:34, 3.67it/s] 10%|█ | 3053/30227 [09:50<1:56:08, 3.90it/s] 10%|█ | 3059/30227 [09:51<1:38:23, 4.60it/s] 10%|█ | 3063/30227 [09:52<1:28:30, 5.12it/s] 10%|█ | 3065/30227 [09:52<1:47:30, 4.21it/s] 10%|█ | 3067/30227 
[tqdm progress output: converting 30,227 DICOM files to JPEG; roughly 3,370 files (11%) were done at 2–7 it/s when the run was interrupted]
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-23-6c5f53c4af46> in <module>()
----> 1 save_yolov3_data_from_rsna(train_dcm_dir, img_dir, label_dir, annots)

<ipython-input-22-e3d442d121ae> in save_yolov3_data_from_rsna(dcm_dir, img_dir, label_dir, annots)
     52             continue
     53         save_label_from_dcm(label_dir, patient_id, row)
---> 54         save_img_from_dcm(dcm_dir, img_dir, patient_id)

<ipython-input-22-e3d442d121ae> in save_img_from_dcm(dcm_dir, img_dir, patient_id)
      4         return
      5     dcm_fp = os.path.join(dcm_dir, "{}.dcm".format(patient_id))
----> 6     img_1ch = pydicom.read_file(dcm_fp).pixel_array

/usr/local/lib/python3.7/dist-packages/pydicom/filereader.py in read_preamble(fp, force)
--> 607     preamble = fp.read(128)

KeyboardInterrupt:
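Because `save_img_from_dcm` returns early when the target JPEG already exists, re-running the interrupted cell resumes the conversion instead of starting over. A minimal sketch of that skip logic, using a hypothetical helper and stand-in file names in a temporary directory:

```python
import os
import tempfile

def convert_if_missing(out_dir, patient_id, convert):
    """Skip patients whose JPEG is already on disk; otherwise convert."""
    out_fp = os.path.join(out_dir, "{}.jpg".format(patient_id))
    if os.path.exists(out_fp):
        return False  # already converted, nothing to do
    convert(out_fp)
    return True

with tempfile.TemporaryDirectory() as out_dir:
    write_jpg = lambda fp: open(fp, "wb").close()  # stand-in for the real DICOM-to-JPEG step
    first = convert_if_missing(out_dir, "patient-001", write_jpg)   # converts
    second = convert_if_missing(out_dir, "patient-001", write_jpg)  # skipped
    print(first, second)  # True False
```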
!du -sh images labels
161M	images
3.9M	labels
Plot a sample train image and label
ex_patient_id = annots[annots.Target == 1].patientId.values[0]
ex_img_path = os.path.join(img_dir, "{}.jpg".format(ex_patient_id))
ex_label_path = os.path.join(label_dir, "{}.txt".format(ex_patient_id))
plt.imshow(cv2.imread(ex_img_path))
img_size = 1024  # RSNA chest radiographs are 1024x1024 pixels
with open(ex_label_path, "r") as f:
    for line in f:
        print(line)
        # YOLO label format: class id, then box center x/y and width/height,
        # all expressed as fractions of the image size
        class_id, rcx, rcy, rw, rh = list(map(float, line.strip().split()))
        x = (rcx - rw / 2) * img_size
        y = (rcy - rh / 2) * img_size
        w = rw * img_size
        h = rh * img_size
        plt.plot([x, x, x + w, x + w, x], [y, y + h, y + h, y, y])
0 0.36181640625 0.33349609375 0.2080078125 0.3701171875
0 0.36181640625 0.33349609375 0.2080078125 0.3701171875
0 0.673828125 0.36962890625 0.25 0.4423828125
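Each label line follows the YOLO convention: class id, then the box center x/y and width/height as fractions of the image size. Scaling the first line above (assuming the standard 1024×1024 RSNA image size) recovers whole-pixel coordinates; a small self-contained check:

```python
def yolo_to_pixels(line, img_size):
    """Convert a YOLO label line to a top-left (x, y) plus (w, h) pixel box."""
    class_id, rcx, rcy, rw, rh = map(float, line.split())
    x = (rcx - rw / 2) * img_size  # center minus half-width -> left edge
    y = (rcy - rh / 2) * img_size  # center minus half-height -> top edge
    return int(class_id), x, y, rw * img_size, rh * img_size

label = "0 0.36181640625 0.33349609375 0.2080078125 0.3701171875"
cls, x, y, w, h = yolo_to_pixels(label, 1024)
print(cls, x, y, w, h)  # 0 264.0 152.0 213.0 379.0
```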
YOLO takes a list of image paths as input, so we write two separate text files: one listing the training images and one listing the validation images.
def write_train_list(metadata_dir, img_dir, name, series):
    list_fp = os.path.join(metadata_dir, name)
    with open(list_fp, "w") as f:
        for patient_id in series:
            line = "{}\n".format(os.path.join(img_dir, "{}.jpg".format(patient_id)))
            f.write(line)
from sklearn.model_selection import train_test_split
# Keep only patients that have at least one bounding box (Target == 1)
patient_id_series = annots[annots.Target == 1].patientId.drop_duplicates()
random_stat = 123
np.random.seed(random_stat)
tr_series, val_series = train_test_split(patient_id_series, test_size=0.1, random_state=random_stat)
print("The # of train set: {}, The # of validation set: {}".format(tr_series.shape[0], val_series.shape[0]))
# train image path list
write_train_list(metadata_dir, img_dir, "tr_list.txt", tr_series)
# validation image path list
write_train_list(metadata_dir, img_dir, "val_list.txt", val_series)
The # of train set: 5410, The # of validation set: 602
Create test image and labels for YOLOv3
def save_yolov3_test_data(test_dcm_dir, img_dir, metadata_dir, name, series):
    list_fp = os.path.join(metadata_dir, name)
    with open(list_fp, "w") as f:
        for patient_id in series:
            save_img_from_dcm(test_dcm_dir, img_dir, patient_id)
            line = "{}\n".format(os.path.join(img_dir, "{}.jpg".format(patient_id)))
            f.write(line)
test_dcm_fps = list(set(glob(os.path.join(test_dcm_dir, '*.dcm'))))
test_dcm_fps = pd.Series(test_dcm_fps).apply(lambda dcm_fp: os.path.basename(dcm_fp).replace(".dcm", ""))
save_yolov3_test_data(test_dcm_dir, img_dir, metadata_dir, "te_list.txt", test_dcm_fps)
Plot a sample test Image
ex_patient_id = test_dcm_fps[0]
ex_img_path = os.path.join(img_dir, "{}.jpg".format(ex_patient_id))
plt.imshow(cv2.imread(ex_img_path))
<matplotlib.image.AxesImage at 0x7fe986aa80d0>
Prepare Configuration Files for Using YOLOv3
/content/darknet/yolov3/cfg/rsna.data
/content/darknet/yolov3/cfg/rsna.names
/content/darknet/yolov3/darknet53.conv.74
/content/darknet/yolov3/cfg/rsna_yolov3.cfg_train
data_extension_file_path = os.path.join(cfg_dir, 'rsna.data')
with open(data_extension_file_path, 'w') as f:
    contents = """classes= 1
train = {}
valid = {}
names = {}
backup = {}
""".format(os.path.join(metadata_dir, "tr_list.txt"),
           os.path.join(metadata_dir, "val_list.txt"),
           os.path.join(cfg_dir, 'rsna.names'),
           backup_dir)
    f.write(contents)
!cat cfg/rsna.data
classes= 1
train = /content/darknet/yolov3/metadata/tr_list.txt
valid = /content/darknet/yolov3/metadata/val_list.txt
names = /content/darknet/yolov3/cfg/rsna.names
backup = /content/darknet/yolov3/backup
# Label list of bounding box.
!echo "pneumonia" > cfg/rsna.names
For training, we download the pre-trained backbone weights (darknet53.conv.74) with the following wget command. The author of darknet uses these same pre-trained weights across different image-recognition tasks.
!wget -q https://pjreddie.com/media/files/darknet53.conv.74
!wget --no-check-certificate -q "https://docs.google.com/uc?export=download&id=18ptTK4Vbeokqpux8Onr0OmwUP9ipmcYO" -O cfg/rsna_yolov3.cfg_train
!wget --no-check-certificate -q "https://docs.google.com/uc?export=download&id=1OhnlV3s7r6xsEme6DKkNYjcYjsl-C_Av" -O train_log.txt
Training YOLOv3
iters = []
losses = []
total_losses = []
with open("train_log.txt", 'r') as f:
    for i, line in enumerate(f):
        if "images" in line:
            tokens = line.strip().split()
            iters.append(int(tokens[0].split(":")[0]))
            losses.append(float(tokens[2]))
            total_losses.append(float(tokens[1].split(',')[0]))
plt.figure(figsize=(20, 5))
plt.subplot(1,2,1)
sns.lineplot(iters, total_losses, label="total loss")
sns.lineplot(iters, losses, label="avg loss")
plt.xlabel("Iteration")
plt.ylabel("Loss")
plt.subplot(1,2,2)
sns.lineplot(iters, total_losses, label="total loss")
sns.lineplot(iters, losses, label="avg loss")
plt.xlabel("Iteration")
plt.ylabel("Loss")
plt.ylim([0, 4.05])
(0.0, 4.05)
Use the trained YOLOv3 model on test images
import shutil
ex_patient_id = annots[annots.Target == 1].patientId.values[2]
# Rebuild the image path for this patient before copying
ex_img_path = os.path.join(img_dir, "{}.jpg".format(ex_patient_id))
shutil.copy(ex_img_path, "test.jpg")
print(ex_patient_id)
00704310-78a8-4b38-8475-49f4573b2dbb
!wget --load-cookies /tmp/cookies.txt -q "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=1FDzMN-kGVYCvBeDKwemAazldSVkAEFyd' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=1FDzMN-kGVYCvBeDKwemAazldSVkAEFyd" -O backup/rsna_yolov3_15300.weights && rm -rf /tmp/cookies.txt
!ls -alsth backup
total 235M
235M -rw-r--r--  1 root root 235M Jun 13 06:52 rsna_yolov3_15300.weights
4.0K drwxr-xr-x  2 root root 4.0K Jun 13 06:52 .
4.0K drwxr-xr-x 13 root root 4.0K Jun 13 06:52 ..
!wget --no-check-certificate -q "https://docs.google.com/uc?export=download&id=10Yk6ZMAKGz5LeBbikciALy82aK3lX-57" -O cfg/rsna_yolov3.cfg_test
!/content/darknet/darknet detector test ../cfg/rsna.data ../cfg/rsna_yolov3.cfg_test ../backup/rsna_yolov3_15300.weights ../test.jpg -thresh 0.005
import cv2
import matplotlib.pyplot as plt
import os.path
fig, ax = plt.subplots()
ax.tick_params(labelbottom=False, bottom=False)
ax.tick_params(labelleft=False, left=False)
ax.set_xticklabels([])
ax.axis('off')
file = '/content/test.jpg'
if os.path.exists(file):
    img = cv2.imread(file)
    show_img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    plt.imshow(show_img)
We ran the code below once to convert the images and labels into arrays. To avoid repeating that work, we saved the arrays in .npy format in a local directory; on subsequent runs we load the .npy files directly, which saves the time of re-reading every image.
#from pathlib import Path
#trainImagesCategories = []
#trainImg = []
#scaleTo = 32
#for i in range(len(data)):
# dcm_file = train_path+'%s.dcm' % data.loc[i, "patientId"]
# path = Path(dcm_file)
# if path.is_file():
# trainImagesCategories.append(data.loc[i, "Target"]) # labels
# trainImg.append(cv2.resize(pydicom.dcmread(dcm_file).pixel_array,(scaleTo, scaleTo))) # images
#trainImgNParray = np.array(trainImg) # create an array of all the images (not the paths)
#trainlabel = pd.DataFrame(trainImagesCategories) # dataframe of all the categories matching each image
# checking the shape of the first image.
#trainImgNParray[1].shape
#trainImgNParray
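The caching pattern described above (compute once, then np.save / np.load) can be sketched as follows; the file path here is a temporary stand-in for the Drive directory used in the notebook:

```python
import os
import tempfile
import numpy as np

def load_or_build(cache_fp, build):
    """Load a cached array if present, otherwise build it and cache it."""
    if os.path.exists(cache_fp):
        return np.load(cache_fp)
    arr = build()
    np.save(cache_fp, arr)
    return arr

with tempfile.TemporaryDirectory() as d:
    fp = os.path.join(d, "trainImgNParray.npy")
    built = load_or_build(fp, lambda: np.zeros((4, 32, 32, 1)))   # first run: builds and saves
    cached = load_or_build(fp, lambda: np.ones((4, 32, 32, 1)))   # second run: loads the cache
    print(cached.shape, cached.sum())  # (4, 32, 32, 1) 0.0
```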
trainImgNParray=np.load('/content/drive/MyDrive/Raw Data/trainImgNParray.npy')
trainlabel=np.load('/content/drive/MyDrive/Raw Data/trainlabel.npy')
# normalize the training data
trainImgNParray = trainImgNParray/255
trainImgNParray.shape
(29145, 32, 32, 1)
datagen = ImageDataGenerator(
    featurewise_center=False,
    samplewise_center=False,
    featurewise_std_normalization=False,
    samplewise_std_normalization=False,
    zca_whitening=False,
    rotation_range=30,
    zoom_range=0.1,
    width_shift_range=0.1,
    height_shift_range=0.1,
    horizontal_flip=True,
    vertical_flip=True)
datagen.fit(trainImgNParray)
from sklearn.model_selection import train_test_split
# 70% / 30% train-test split, stratified by the training labels
trainX, testX, trainY, testY = train_test_split(trainImgNParray, trainlabel,
                                                test_size=0.30, random_state=1,
                                                stratify=trainlabel)
scaleTo = 32
model = Sequential()
model.add(Conv2D(32 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu' , input_shape = (scaleTo,scaleTo,1)))
model.add(BatchNormalization())
model.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model.add(Conv2D(64 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu'))
#model.add(Dropout(0.1))
model.add(BatchNormalization())
model.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model.add(Conv2D(120 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu'))
model.add(BatchNormalization())
model.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model.add(Conv2D(120 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu'))
model.add(BatchNormalization())
model.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model.add(Conv2D(150 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu'))
model.add(BatchNormalization())
model.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model.add(Flatten())
#model.add(Dense(units = 6 , activation = 'relu'))
#model.add(Dropout(0.2))
model.add(Dense(units = 1 , activation = 'sigmoid'))
model.compile(optimizer = "Adam" , loss = 'binary_crossentropy' , metrics = ['accuracy'])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 32, 32, 32)        320
batch_normalization (BatchNo (None, 32, 32, 32)        128
max_pooling2d (MaxPooling2D) (None, 32, 32, 32)        0
conv2d_1 (Conv2D)            (None, 32, 32, 64)        18496
batch_normalization_1 (Batch (None, 32, 32, 64)        256
max_pooling2d_1 (MaxPooling2 (None, 32, 32, 64)        0
conv2d_2 (Conv2D)            (None, 32, 32, 120)       69240
batch_normalization_2 (Batch (None, 32, 32, 120)       480
max_pooling2d_2 (MaxPooling2 (None, 32, 32, 120)       0
conv2d_3 (Conv2D)            (None, 32, 32, 120)       129720
batch_normalization_3 (Batch (None, 32, 32, 120)       480
max_pooling2d_3 (MaxPooling2 (None, 32, 32, 120)       0
conv2d_4 (Conv2D)            (None, 32, 32, 150)       162150
batch_normalization_4 (Batch (None, 32, 32, 150)       600
max_pooling2d_4 (MaxPooling2 (None, 32, 32, 150)       0
flatten (Flatten)            (None, 153600)            0
dense (Dense)                (None, 1)                 153601
=================================================================
Total params: 535,471
Trainable params: 534,499
Non-trainable params: 972
_________________________________________________________________
from tensorflow.keras.callbacks import ReduceLROnPlateau
# Note: min_lr=0.01 sits above Adam's default initial rate (1e-3), so this callback
# never actually lowers the learning rate; a floor such as min_lr=1e-5 would make it effective.
learning_rate_reduction = ReduceLROnPlateau(monitor='val_accuracy', patience=2, verbose=1, factor=0.3, min_lr=0.01)
history = model.fit(trainX,trainY, batch_size = 100 ,epochs = 50 , validation_data = (testX, testY) ,callbacks = [learning_rate_reduction])
Epoch 1/50 205/205 [==============================] - 24s 38ms/step - loss: 2.2639 - accuracy: 0.7025 - val_loss: 2.0000 - val_accuracy: 0.6794 Epoch 2/50 205/205 [==============================] - 7s 36ms/step - loss: 2.2298 - accuracy: 0.7086 - val_loss: 0.6734 - val_accuracy: 0.6846 Epoch 3/50 205/205 [==============================] - 7s 35ms/step - loss: 1.0987 - accuracy: 0.7269 - val_loss: 0.6256 - val_accuracy: 0.7710 Epoch 4/50 205/205 [==============================] - 7s 35ms/step - loss: 0.8238 - accuracy: 0.7364 - val_loss: 9.6653 - val_accuracy: 0.3225 Epoch 5/50 205/205 [==============================] - 7s 35ms/step - loss: 2.0675 - accuracy: 0.7080 - val_loss: 12.4083 - val_accuracy: 0.3337 Epoch 6/50 205/205 [==============================] - 7s 35ms/step - loss: 1.8130 - accuracy: 0.7077 - val_loss: 6.5801 - val_accuracy: 0.6793 Epoch 7/50 205/205 [==============================] - 7s 35ms/step - loss: 1.7394 - accuracy: 0.7052 - val_loss: 1.5016 - val_accuracy: 0.6874 Epoch 8/50 205/205 [==============================] - 7s 35ms/step - loss: 0.9058 - accuracy: 0.7366 - val_loss: 1.9999 - val_accuracy: 0.6842 Epoch 9/50 205/205 [==============================] - 7s 35ms/step - loss: 1.2576 - accuracy: 0.7263 - val_loss: 1.2530 - val_accuracy: 0.7088 Epoch 10/50 205/205 [==============================] - 7s 35ms/step - loss: 0.7103 - accuracy: 0.7435 - val_loss: 0.8768 - val_accuracy: 0.7395 Epoch 11/50 205/205 [==============================] - 7s 35ms/step - loss: 0.9689 - accuracy: 0.7444 - val_loss: 0.5938 - val_accuracy: 0.7730 Epoch 12/50 205/205 [==============================] - 7s 35ms/step - loss: 0.6483 - accuracy: 0.7609 - val_loss: 2.0450 - val_accuracy: 0.6855 Epoch 13/50 205/205 [==============================] - 7s 35ms/step - loss: 0.9011 - accuracy: 0.7481 - val_loss: 2.0144 - val_accuracy: 0.4887 Epoch 14/50 205/205 [==============================] - 7s 35ms/step - loss: 0.8965 - accuracy: 0.7451 - val_loss: 1.0867 - 
val_accuracy: 0.7157 Epoch 15/50 205/205 [==============================] - 7s 35ms/step - loss: 0.8783 - accuracy: 0.7396 - val_loss: 1.4680 - val_accuracy: 0.6796 Epoch 16/50 205/205 [==============================] - 7s 35ms/step - loss: 1.0818 - accuracy: 0.7267 - val_loss: 0.5236 - val_accuracy: 0.7717 Epoch 17/50 205/205 [==============================] - 7s 35ms/step - loss: 0.5662 - accuracy: 0.7657 - val_loss: 2.0843 - val_accuracy: 0.4337 Epoch 18/50 205/205 [==============================] - 7s 35ms/step - loss: 0.7632 - accuracy: 0.7472 - val_loss: 1.1637 - val_accuracy: 0.7498 Epoch 19/50 205/205 [==============================] - 7s 35ms/step - loss: 0.5652 - accuracy: 0.7679 - val_loss: 0.7241 - val_accuracy: 0.6568 Epoch 20/50 205/205 [==============================] - 7s 35ms/step - loss: 0.8539 - accuracy: 0.7479 - val_loss: 0.5211 - val_accuracy: 0.7646 Epoch 21/50 205/205 [==============================] - 7s 35ms/step - loss: 0.5497 - accuracy: 0.7658 - val_loss: 0.4905 - val_accuracy: 0.7656 Epoch 22/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4955 - accuracy: 0.7769 - val_loss: 0.4985 - val_accuracy: 0.7705 Epoch 23/50 205/205 [==============================] - 7s 35ms/step - loss: 0.5077 - accuracy: 0.7819 - val_loss: 0.6858 - val_accuracy: 0.6751 Epoch 24/50 205/205 [==============================] - 7s 35ms/step - loss: 0.6671 - accuracy: 0.7492 - val_loss: 0.7318 - val_accuracy: 0.7065 Epoch 25/50 205/205 [==============================] - 7s 35ms/step - loss: 0.5182 - accuracy: 0.7759 - val_loss: 0.5346 - val_accuracy: 0.7367 Epoch 26/50 205/205 [==============================] - 7s 35ms/step - loss: 0.5191 - accuracy: 0.7726 - val_loss: 0.5155 - val_accuracy: 0.7630 Epoch 27/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4692 - accuracy: 0.7879 - val_loss: 0.5129 - val_accuracy: 0.7667 Epoch 28/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4612 - accuracy: 0.7943 - 
val_loss: 0.7510 - val_accuracy: 0.7402 Epoch 29/50 205/205 [==============================] - 7s 35ms/step - loss: 0.5633 - accuracy: 0.7681 - val_loss: 0.5023 - val_accuracy: 0.7597 Epoch 30/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4762 - accuracy: 0.7856 - val_loss: 0.5906 - val_accuracy: 0.7306 Epoch 31/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4607 - accuracy: 0.7877 - val_loss: 0.9885 - val_accuracy: 0.5499 Epoch 32/50 205/205 [==============================] - 7s 35ms/step - loss: 0.5633 - accuracy: 0.7600 - val_loss: 0.5242 - val_accuracy: 0.7664 Epoch 33/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4921 - accuracy: 0.7803 - val_loss: 0.5148 - val_accuracy: 0.7754 Epoch 34/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4745 - accuracy: 0.7860 - val_loss: 0.4735 - val_accuracy: 0.7795 Epoch 35/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4645 - accuracy: 0.7923 - val_loss: 0.5125 - val_accuracy: 0.7729 Epoch 36/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4593 - accuracy: 0.7920 - val_loss: 0.4717 - val_accuracy: 0.7825 Epoch 37/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4511 - accuracy: 0.7963 - val_loss: 0.5170 - val_accuracy: 0.7667 Epoch 38/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4871 - accuracy: 0.7787 - val_loss: 0.4758 - val_accuracy: 0.7779 Epoch 39/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4568 - accuracy: 0.7935 - val_loss: 0.4541 - val_accuracy: 0.7844 Epoch 40/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4401 - accuracy: 0.7976 - val_loss: 0.5099 - val_accuracy: 0.7577 Epoch 41/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4697 - accuracy: 0.7898 - val_loss: 0.4549 - val_accuracy: 0.7847 Epoch 42/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4382 - 
accuracy: 0.7996 - val_loss: 0.4917 - val_accuracy: 0.7693 Epoch 43/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4383 - accuracy: 0.7992 - val_loss: 0.6881 - val_accuracy: 0.7332 Epoch 44/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4922 - accuracy: 0.7821 - val_loss: 0.4549 - val_accuracy: 0.7864 Epoch 45/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4389 - accuracy: 0.7998 - val_loss: 0.5768 - val_accuracy: 0.7096 Epoch 46/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4668 - accuracy: 0.7824 - val_loss: 0.5047 - val_accuracy: 0.7638 Epoch 47/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4701 - accuracy: 0.7827 - val_loss: 0.4781 - val_accuracy: 0.7738 Epoch 48/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4383 - accuracy: 0.7952 - val_loss: 0.4514 - val_accuracy: 0.7885 Epoch 49/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4315 - accuracy: 0.8005 - val_loss: 0.4589 - val_accuracy: 0.7861 Epoch 50/50 205/205 [==============================] - 7s 35ms/step - loss: 0.4260 - accuracy: 0.8030 - val_loss: 0.4949 - val_accuracy: 0.7704
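For reference, ReduceLROnPlateau roughly applies new_lr = max(lr * factor, min_lr) once the monitored metric fails to improve for `patience` epochs. A plain-Python sketch of that rule (not the Keras implementation); note how a min_lr set above the current learning rate blocks any reduction:

```python
def reduce_on_plateau(metric_history, lr, factor=0.3, patience=2, min_lr=1e-5):
    """Walk an epoch-by-epoch metric history, reducing lr after each plateau."""
    best, wait = float("-inf"), 0
    for value in metric_history:
        if value > best:
            best, wait = value, 0        # improvement: reset the patience counter
        else:
            wait += 1
            if wait >= patience and lr > min_lr:
                lr, wait = max(lr * factor, min_lr), 0  # plateau: cut the rate
    return lr

history = [0.68, 0.69, 0.68, 0.67, 0.70]              # two stagnant epochs -> one cut
print(round(reduce_on_plateau(history, 1e-3), 6))      # 0.0003
print(reduce_on_plateau(history, 1e-3, min_lr=0.01))   # 0.001: min_lr above lr blocks the cut
```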
# Evaluate the model
model.evaluate(testX,testY)
274/274 [==============================] - 1s 5ms/step - loss: 0.4949 - accuracy: 0.7704
[0.49486997723579407, 0.7703568339347839]
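Accuracy alone can be misleading on a class-imbalanced dataset like this one, where most radiographs have no opacity. Precision and recall are worth checking alongside it; a sketch using sklearn on synthetic sigmoid outputs (the real call would threshold model.predict(testX) against testY):

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score

# Synthetic stand-ins for testY and model.predict(testX)
y_true = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1])
y_prob = np.array([0.1, 0.2, 0.3, 0.6, 0.2, 0.1, 0.8, 0.9, 0.4, 0.7])
y_pred = (y_prob > 0.5).astype(int)  # threshold the sigmoid output at 0.5

print(precision_score(y_true, y_pred))  # 0.75: 3 of 4 positive calls are correct
print(recall_score(y_true, y_pred))     # 0.75: 3 of 4 true positives are found
```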
plt.figure(figsize=(8,6))
plt.title('Accuracy scores')
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.legend(['accuracy', 'val_accuracy'])
plt.show()
plt.figure(figsize=(8,6))
plt.title('Loss value')
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.legend(['loss', 'val_loss'])
plt.show()
from imblearn.over_sampling import SMOTE
dataForSmote = trainImgNParray.reshape(trainImgNParray.shape[0], scaleTo * scaleTo * 1)
smote = SMOTE(sampling_strategy = 0.8)
x_smote, y_smote = smote.fit_resample(dataForSmote , trainlabel)
x_smote.shape
(35643, 1024)
X_smote = x_smote.reshape(-1, scaleTo, scaleTo, 1)
X_smote.shape
(35643, 32, 32, 1)
y_smote.shape
(35643,)
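sampling_strategy=0.8 asks SMOTE to synthesize minority samples until the minority/majority ratio reaches 0.8; it does not touch the majority class. The count arithmetic, using hypothetical class counts for illustration, is simply:

```python
def smote_target_counts(n_majority, n_minority, sampling_strategy=0.8):
    """Counts after SMOTE oversampling: minority grows to ratio * majority."""
    n_minority_after = int(round(sampling_strategy * n_majority))
    synthesized = max(0, n_minority_after - n_minority)  # only adds, never removes
    return n_majority + n_minority + synthesized, synthesized

total_after, new_samples = smote_target_counts(n_majority=20000, n_minority=9000)
print(total_after, new_samples)  # 36000 7000
```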
from sklearn.model_selection import train_test_split
# 70% / 30% train-test split, stratified by the SMOTE-balanced labels
trainX_smote, testX_smote, trainY_smote, testY_smote = train_test_split(X_smote, y_smote,
                                                                        test_size=0.30, random_state=1,
                                                                        stratify=y_smote)
model_smote = Sequential()
model_smote.add(Conv2D(32 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu' , input_shape = (scaleTo,scaleTo,1)))
model_smote.add(BatchNormalization())
model_smote.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model_smote.add(Conv2D(64 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu'))
#model.add(Dropout(0.1))
model_smote.add(BatchNormalization())
model_smote.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model_smote.add(Conv2D(120 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu'))
model_smote.add(BatchNormalization())
model_smote.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model_smote.add(Conv2D(120 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu'))
model_smote.add(BatchNormalization())
model_smote.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model_smote.add(Conv2D(150 , (3,3) , strides = 1 , padding = 'same' , activation = 'relu'))
model_smote.add(BatchNormalization())
model_smote.add(MaxPool2D((2,2) , strides = 1 , padding = 'same'))
model_smote.add(Flatten())
#model.add(Dense(units = 6 , activation = 'relu'))
#model.add(Dropout(0.2))
model_smote.add(Dense(units = 1 , activation = 'sigmoid'))
model_smote.compile(optimizer = "Adam" , loss = 'binary_crossentropy' , metrics = ['accuracy'])
model_smote.summary()
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_15 (Conv2D)           (None, 32, 32, 32)        320
batch_normalization_15 (Batc (None, 32, 32, 32)        128
max_pooling2d_15 (MaxPooling (None, 32, 32, 32)        0
conv2d_16 (Conv2D)           (None, 32, 32, 64)        18496
batch_normalization_16 (Batc (None, 32, 32, 64)        256
max_pooling2d_16 (MaxPooling (None, 32, 32, 64)        0
conv2d_17 (Conv2D)           (None, 32, 32, 120)       69240
batch_normalization_17 (Batc (None, 32, 32, 120)       480
max_pooling2d_17 (MaxPooling (None, 32, 32, 120)       0
conv2d_18 (Conv2D)           (None, 32, 32, 120)       129720
batch_normalization_18 (Batc (None, 32, 32, 120)       480
max_pooling2d_18 (MaxPooling (None, 32, 32, 120)       0
conv2d_19 (Conv2D)           (None, 32, 32, 150)       162150
batch_normalization_19 (Batc (None, 32, 32, 150)       600
max_pooling2d_19 (MaxPooling (None, 32, 32, 150)       0
flatten_5 (Flatten)          (None, 153600)            0
dense_5 (Dense)              (None, 1)                 153601
=================================================================
Total params: 535,471
Trainable params: 534,499
Non-trainable params: 972
_________________________________________________________________
history_smote = model_smote.fit(trainX_smote,trainY_smote, batch_size = 100 ,epochs = 50 , validation_data = (testX_smote, testY_smote) ,callbacks = [learning_rate_reduction])
Epoch 1/50 250/250 [==============================] - 10s 37ms/step - loss: 2.1608 - accuracy: 0.6937 - val_loss: 1.8753 - val_accuracy: 0.4444 Epoch 2/50 250/250 [==============================] - 9s 36ms/step - loss: 1.2495 - accuracy: 0.7169 - val_loss: 0.5644 - val_accuracy: 0.7067 Epoch 3/50 250/250 [==============================] - 9s 36ms/step - loss: 0.8283 - accuracy: 0.7410 - val_loss: 0.9862 - val_accuracy: 0.7136 Epoch 4/50 250/250 [==============================] - 9s 36ms/step - loss: 0.7727 - accuracy: 0.7508 - val_loss: 0.5360 - val_accuracy: 0.7594 Epoch 5/50 250/250 [==============================] - 9s 36ms/step - loss: 0.6438 - accuracy: 0.7606 - val_loss: 0.8637 - val_accuracy: 0.7521 Epoch 6/50 250/250 [==============================] - 9s 36ms/step - loss: 0.5906 - accuracy: 0.7755 - val_loss: 0.5235 - val_accuracy: 0.7554 Epoch 7/50 250/250 [==============================] - 9s 36ms/step - loss: 0.8738 - accuracy: 0.7505 - val_loss: 0.5609 - val_accuracy: 0.7440 Epoch 8/50 250/250 [==============================] - 9s 36ms/step - loss: 0.5976 - accuracy: 0.7687 - val_loss: 0.5380 - val_accuracy: 0.7655 Epoch 9/50 250/250 [==============================] - 9s 36ms/step - loss: 0.5892 - accuracy: 0.7674 - val_loss: 0.5271 - val_accuracy: 0.7432 Epoch 10/50 250/250 [==============================] - 9s 36ms/step - loss: 0.5229 - accuracy: 0.7818 - val_loss: 0.4854 - val_accuracy: 0.7775 Epoch 11/50 250/250 [==============================] - 9s 36ms/step - loss: 0.5319 - accuracy: 0.7849 - val_loss: 0.4636 - val_accuracy: 0.7886 Epoch 12/50 250/250 [==============================] - 9s 36ms/step - loss: 0.4356 - accuracy: 0.8076 - val_loss: 0.4822 - val_accuracy: 0.7993 Epoch 13/50 250/250 [==============================] - 9s 36ms/step - loss: 0.4229 - accuracy: 0.8132 - val_loss: 0.4775 - val_accuracy: 0.7835 Epoch 14/50 250/250 [==============================] - 9s 36ms/step - loss: 0.3935 - accuracy: 0.8263 - val_loss: 0.4519 - 
val_accuracy: 0.8024 Epoch 15/50 250/250 [==============================] - 9s 36ms/step - loss: 0.3884 - accuracy: 0.8335 - val_loss: 0.4837 - val_accuracy: 0.7797 Epoch 16/50 250/250 [==============================] - 9s 36ms/step - loss: 0.3757 - accuracy: 0.8375 - val_loss: 0.4871 - val_accuracy: 0.7990 Epoch 17/50 250/250 [==============================] - 9s 36ms/step - loss: 0.3596 - accuracy: 0.8470 - val_loss: 0.5647 - val_accuracy: 0.7517 Epoch 18/50 250/250 [==============================] - 9s 36ms/step - loss: 0.3536 - accuracy: 0.8497 - val_loss: 0.4648 - val_accuracy: 0.8144 Epoch 19/50 250/250 [==============================] - 9s 36ms/step - loss: 0.3462 - accuracy: 0.8544 - val_loss: 0.5422 - val_accuracy: 0.7783 Epoch 20/50 250/250 [==============================] - 9s 36ms/step - loss: 0.3336 - accuracy: 0.8578 - val_loss: 0.6236 - val_accuracy: 0.7568 Epoch 21/50 250/250 [==============================] - 9s 36ms/step - loss: 0.3060 - accuracy: 0.8728 - val_loss: 0.4721 - val_accuracy: 0.8082 Epoch 22/50 250/250 [==============================] - 9s 36ms/step - loss: 0.2776 - accuracy: 0.8835 - val_loss: 0.5664 - val_accuracy: 0.7896 Epoch 23/50 250/250 [==============================] - 9s 36ms/step - loss: 0.2569 - accuracy: 0.8960 - val_loss: 0.5192 - val_accuracy: 0.7989 Epoch 24/50 250/250 [==============================] - 9s 36ms/step - loss: 0.2382 - accuracy: 0.9036 - val_loss: 0.5862 - val_accuracy: 0.8013 Epoch 25/50 250/250 [==============================] - 9s 36ms/step - loss: 0.2188 - accuracy: 0.9106 - val_loss: 0.8999 - val_accuracy: 0.7201 Epoch 26/50 250/250 [==============================] - 9s 36ms/step - loss: 0.1967 - accuracy: 0.9209 - val_loss: 0.6424 - val_accuracy: 0.7583 Epoch 27/50 250/250 [==============================] - 9s 36ms/step - loss: 0.1781 - accuracy: 0.9285 - val_loss: 0.6775 - val_accuracy: 0.7597 Epoch 28/50 250/250 [==============================] - 9s 36ms/step - loss: 0.1572 - accuracy: 0.9371 - 
val_loss: 0.6224 - val_accuracy: 0.8192 Epoch 29/50 250/250 [==============================] - 9s 36ms/step - loss: 0.1564 - accuracy: 0.9385 - val_loss: 0.7494 - val_accuracy: 0.8074 Epoch 30/50 250/250 [==============================] - 9s 36ms/step - loss: 0.1218 - accuracy: 0.9527 - val_loss: 0.7142 - val_accuracy: 0.8269 Epoch 31/50 250/250 [==============================] - 9s 36ms/step - loss: 0.1168 - accuracy: 0.9542 - val_loss: 0.7983 - val_accuracy: 0.7913 Epoch 32/50 250/250 [==============================] - 9s 36ms/step - loss: 0.1211 - accuracy: 0.9543 - val_loss: 0.8297 - val_accuracy: 0.7618 Epoch 33/50 250/250 [==============================] - 9s 37ms/step - loss: 0.0979 - accuracy: 0.9627 - val_loss: 0.7820 - val_accuracy: 0.7963 Epoch 34/50 250/250 [==============================] - 9s 37ms/step - loss: 0.0822 - accuracy: 0.9688 - val_loss: 0.7718 - val_accuracy: 0.8318 Epoch 35/50 250/250 [==============================] - 9s 37ms/step - loss: 0.0709 - accuracy: 0.9739 - val_loss: 0.7573 - val_accuracy: 0.8398 Epoch 36/50 250/250 [==============================] - 9s 36ms/step - loss: 0.1062 - accuracy: 0.9594 - val_loss: 0.9721 - val_accuracy: 0.8125 Epoch 37/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0823 - accuracy: 0.9673 - val_loss: 0.7669 - val_accuracy: 0.8345 Epoch 38/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0659 - accuracy: 0.9747 - val_loss: 0.9877 - val_accuracy: 0.8184 Epoch 39/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0511 - accuracy: 0.9808 - val_loss: 0.8458 - val_accuracy: 0.8360 Epoch 40/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0578 - accuracy: 0.9790 - val_loss: 1.0554 - val_accuracy: 0.8275 Epoch 41/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0855 - accuracy: 0.9686 - val_loss: 0.8627 - val_accuracy: 0.8222 Epoch 42/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0698 - 
accuracy: 0.9737 - val_loss: 1.0207 - val_accuracy: 0.8289 Epoch 43/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0447 - accuracy: 0.9842 - val_loss: 1.1613 - val_accuracy: 0.8045 Epoch 44/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0395 - accuracy: 0.9862 - val_loss: 1.0537 - val_accuracy: 0.8123 Epoch 45/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0599 - accuracy: 0.9787 - val_loss: 1.0702 - val_accuracy: 0.8021 Epoch 46/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0596 - accuracy: 0.9785 - val_loss: 1.2563 - val_accuracy: 0.8042 Epoch 47/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0777 - accuracy: 0.9721 - val_loss: 0.8951 - val_accuracy: 0.8265 Epoch 48/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0403 - accuracy: 0.9855 - val_loss: 0.9613 - val_accuracy: 0.8476 Epoch 49/50 250/250 [==============================] - 9s 36ms/step - loss: 0.0220 - accuracy: 0.9926 - val_loss: 0.9492 - val_accuracy: 0.8580 Epoch 50/50 250/250 [==============================] - 9s 37ms/step - loss: 0.0143 - accuracy: 0.9957 - val_loss: 0.9783 - val_accuracy: 0.8553
# Evaluate the model
model_smote.evaluate(testX_smote,testY_smote)
335/335 [==============================] - 2s 5ms/step - loss: 0.9783 - accuracy: 0.8553
[0.978251576423645, 0.8553259372711182]
plt.figure(figsize=(8,6))
plt.title('Accuracy scores')
plt.plot(history_smote.history['accuracy'])
plt.plot(history_smote.history['val_accuracy'])
plt.legend(['accuracy', 'val_accuracy'])
plt.show()
plt.figure(figsize=(8,6))
plt.title('Loss value')
plt.plot(history_smote.history['loss'])
plt.plot(history_smote.history['val_loss'])
plt.legend(['loss', 'val_loss'])
plt.show()
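The two plotting cells above repeat the same pattern for accuracy and loss. A small helper (the name `plot_history` is mine, not from the notebook) removes the duplication and adds the axis labels the originals were missing:

```python
import matplotlib
matplotlib.use('Agg')  # headless-safe backend for this sketch
import matplotlib.pyplot as plt

def plot_history(history, metric='accuracy'):
    """Plot a training metric and its validation counterpart from a Keras History."""
    plt.figure(figsize=(8, 6))
    plt.title(f'{metric} per epoch')
    plt.plot(history.history[metric])
    plt.plot(history.history[f'val_{metric}'])
    plt.xlabel('epoch')
    plt.ylabel(metric)
    plt.legend([metric, f'val_{metric}'])
    plt.show()
```

Usage with any History object from this notebook, e.g. `plot_history(history_smote, 'accuracy')` and `plot_history(history_smote, 'loss')`.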
from tensorflow.keras.applications import VGG16
from tensorflow.keras.losses import binary_crossentropy
# stack the single grayscale channel 3 times so the images match the 3-channel input of ImageNet-pretrained models
trainX = np.repeat(trainX, 3, -1)
testX = np.repeat(testX, 3, -1)
print(trainX.shape)
print(testX.shape)
(20401, 32, 32, 3) (8744, 32, 32, 3)
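`np.repeat` along the last axis simply copies the grayscale channel three times, turning each (32, 32, 1) array into the (32, 32, 3) shape that ImageNet-pretrained backbones expect. A minimal standalone sketch of what the two cells above do:

```python
import numpy as np

# one fake 32x32 single-channel "radiograph" (illustrative data only)
gray = np.random.rand(1, 32, 32, 1).astype('float32')

# copy the channel 3 times along the last axis
rgb = np.repeat(gray, 3, axis=-1)

print(gray.shape, '->', rgb.shape)  # (1, 32, 32, 1) -> (1, 32, 32, 3)

# all three channels hold identical pixel values
assert np.array_equal(rgb[..., 0], rgb[..., 1])
```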
# input size expected by the pretrained network
IMAGE_SIZE = [scaleTo, scaleTo]
# load VGG16 with ImageNet weights, dropping its fully connected classifier head
vgg = VGG16(input_shape=IMAGE_SIZE + [3], weights='imagenet', include_top=False)
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5 58892288/58889256 [==============================] - 0s 0us/step
# freeze the pretrained weights so only the new head is trained
for layer in vgg.layers:
    layer.trainable = False
# new classification head: flatten the VGG features and add a sigmoid output
x = Flatten()(vgg.output)
prediction = Dense(1, activation='sigmoid')(x)
# create a model object
vgg_model = Model(inputs=vgg.input, outputs=prediction)
# view the structure of the model
vgg_model.summary()
Model: "model" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= input_1 (InputLayer) [(None, 32, 32, 3)] 0 _________________________________________________________________ block1_conv1 (Conv2D) (None, 32, 32, 64) 1792 _________________________________________________________________ block1_conv2 (Conv2D) (None, 32, 32, 64) 36928 _________________________________________________________________ block1_pool (MaxPooling2D) (None, 16, 16, 64) 0 _________________________________________________________________ block2_conv1 (Conv2D) (None, 16, 16, 128) 73856 _________________________________________________________________ block2_conv2 (Conv2D) (None, 16, 16, 128) 147584 _________________________________________________________________ block2_pool (MaxPooling2D) (None, 8, 8, 128) 0 _________________________________________________________________ block3_conv1 (Conv2D) (None, 8, 8, 256) 295168 _________________________________________________________________ block3_conv2 (Conv2D) (None, 8, 8, 256) 590080 _________________________________________________________________ block3_conv3 (Conv2D) (None, 8, 8, 256) 590080 _________________________________________________________________ block3_pool (MaxPooling2D) (None, 4, 4, 256) 0 _________________________________________________________________ block4_conv1 (Conv2D) (None, 4, 4, 512) 1180160 _________________________________________________________________ block4_conv2 (Conv2D) (None, 4, 4, 512) 2359808 _________________________________________________________________ block4_conv3 (Conv2D) (None, 4, 4, 512) 2359808 _________________________________________________________________ block4_pool (MaxPooling2D) (None, 2, 2, 512) 0 _________________________________________________________________ block5_conv1 (Conv2D) (None, 2, 2, 512) 2359808 _________________________________________________________________ 
block5_conv2 (Conv2D) (None, 2, 2, 512) 2359808 _________________________________________________________________ block5_conv3 (Conv2D) (None, 2, 2, 512) 2359808 _________________________________________________________________ block5_pool (MaxPooling2D) (None, 1, 1, 512) 0 _________________________________________________________________ flatten_3 (Flatten) (None, 512) 0 _________________________________________________________________ dense_3 (Dense) (None, 1) 513 ================================================================= Total params: 14,715,201 Trainable params: 513 Non-trainable params: 14,714,688 _________________________________________________________________
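The summary's 513 trainable parameters are worth a quick sanity check (plain arithmetic, not part of the notebook): with a 32x32 input, VGG16's five max-pool halvings collapse the spatial map to 1x1x512, so Flatten yields 512 features and the frozen base leaves only the Dense(1) head to train.

```python
# five max-pool stages each halve the spatial size: 32 -> 16 -> 8 -> 4 -> 2 -> 1
spatial = 32 // 2**5
features = spatial * spatial * 512   # Flatten output size
trainable = features * 1 + 1         # Dense(1): one weight per feature, plus bias

print(features, trainable)  # 512 513
```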
# Compile the model
vgg_model.compile(
    loss=binary_crossentropy,
    optimizer='sgd',
    metrics=['accuracy']
)
Vgghistory = vgg_model.fit(trainX, trainY, batch_size=100, epochs=40, validation_data=(testX, testY))
Epoch 1/40 205/205 [==============================] - 5s 17ms/step - loss: 0.5508 - accuracy: 0.7159 - val_loss: 0.5349 - val_accuracy: 0.7119 Epoch 2/40 205/205 [==============================] - 3s 14ms/step - loss: 0.5001 - accuracy: 0.7604 - val_loss: 0.6064 - val_accuracy: 0.6824 Epoch 3/40 205/205 [==============================] - 3s 17ms/step - loss: 0.4907 - accuracy: 0.7677 - val_loss: 0.5103 - val_accuracy: 0.7587 Epoch 4/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4845 - accuracy: 0.7716 - val_loss: 0.5270 - val_accuracy: 0.7360 Epoch 5/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4820 - accuracy: 0.7738 - val_loss: 0.5263 - val_accuracy: 0.7465 Epoch 6/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4797 - accuracy: 0.7746 - val_loss: 0.5368 - val_accuracy: 0.7327 Epoch 7/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4772 - accuracy: 0.7749 - val_loss: 0.4970 - val_accuracy: 0.7559 Epoch 8/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4756 - accuracy: 0.7772 - val_loss: 0.5078 - val_accuracy: 0.7507 Epoch 9/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4736 - accuracy: 0.7784 - val_loss: 0.5442 - val_accuracy: 0.7301 Epoch 10/40 205/205 [==============================] - 3s 17ms/step - loss: 0.4735 - accuracy: 0.7783 - val_loss: 0.4903 - val_accuracy: 0.7594 Epoch 11/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4716 - accuracy: 0.7771 - val_loss: 0.4854 - val_accuracy: 0.7659 Epoch 12/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4707 - accuracy: 0.7794 - val_loss: 0.4794 - val_accuracy: 0.7748 Epoch 13/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4698 - accuracy: 0.7806 - val_loss: 0.4796 - val_accuracy: 0.7738 Epoch 14/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4693 - accuracy: 0.7797 - val_loss: 0.4780 - val_accuracy: 
0.7755 Epoch 15/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4686 - accuracy: 0.7793 - val_loss: 0.4917 - val_accuracy: 0.7707 Epoch 16/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4683 - accuracy: 0.7783 - val_loss: 0.5060 - val_accuracy: 0.7522 Epoch 17/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4678 - accuracy: 0.7800 - val_loss: 0.4778 - val_accuracy: 0.7748 Epoch 18/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4672 - accuracy: 0.7813 - val_loss: 0.5524 - val_accuracy: 0.7295 Epoch 19/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4672 - accuracy: 0.7801 - val_loss: 0.5001 - val_accuracy: 0.7650 Epoch 20/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4664 - accuracy: 0.7809 - val_loss: 0.4753 - val_accuracy: 0.7764 Epoch 21/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4659 - accuracy: 0.7811 - val_loss: 0.4977 - val_accuracy: 0.7661 Epoch 22/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4657 - accuracy: 0.7814 - val_loss: 0.5648 - val_accuracy: 0.7229 Epoch 23/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4662 - accuracy: 0.7808 - val_loss: 0.5080 - val_accuracy: 0.7619 Epoch 24/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4652 - accuracy: 0.7801 - val_loss: 0.4760 - val_accuracy: 0.7750 Epoch 25/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4642 - accuracy: 0.7817 - val_loss: 0.4752 - val_accuracy: 0.7753 Epoch 26/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4640 - accuracy: 0.7817 - val_loss: 0.4921 - val_accuracy: 0.7611 Epoch 27/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4641 - accuracy: 0.7806 - val_loss: 0.5544 - val_accuracy: 0.7294 Epoch 28/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4646 - accuracy: 0.7825 - val_loss: 0.5436 
- val_accuracy: 0.7372 Epoch 29/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4646 - accuracy: 0.7817 - val_loss: 0.4762 - val_accuracy: 0.7739 Epoch 30/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4633 - accuracy: 0.7827 - val_loss: 0.4746 - val_accuracy: 0.7752 Epoch 31/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4627 - accuracy: 0.7826 - val_loss: 0.5033 - val_accuracy: 0.7627 Epoch 32/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4631 - accuracy: 0.7830 - val_loss: 0.4725 - val_accuracy: 0.7765 Epoch 33/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4623 - accuracy: 0.7832 - val_loss: 0.4767 - val_accuracy: 0.7738 Epoch 34/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4623 - accuracy: 0.7838 - val_loss: 0.5418 - val_accuracy: 0.7373 Epoch 35/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4631 - accuracy: 0.7806 - val_loss: 0.6005 - val_accuracy: 0.7133 Epoch 36/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4638 - accuracy: 0.7816 - val_loss: 0.4799 - val_accuracy: 0.7682 Epoch 37/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4621 - accuracy: 0.7830 - val_loss: 0.5697 - val_accuracy: 0.7244 Epoch 38/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4629 - accuracy: 0.7807 - val_loss: 0.4734 - val_accuracy: 0.7756 Epoch 39/40 205/205 [==============================] - 3s 14ms/step - loss: 0.4614 - accuracy: 0.7830 - val_loss: 0.5228 - val_accuracy: 0.7508 Epoch 40/40 205/205 [==============================] - 3s 17ms/step - loss: 0.4619 - accuracy: 0.7819 - val_loss: 0.4848 - val_accuracy: 0.7749
# Evaluate the model
vgg_model.evaluate(testX,testY)
274/274 [==============================] - 2s 7ms/step - loss: 0.4848 - accuracy: 0.7749
[0.48479315638542175, 0.7749313712120056]
plt.figure(figsize=(8,6))
plt.title('Accuracy scores')
plt.plot(Vgghistory.history['accuracy'])
plt.plot(Vgghistory.history['val_accuracy'])
plt.legend(['accuracy', 'val_accuracy'])
plt.show()
plt.figure(figsize=(8,6))
plt.title('Loss value')
plt.plot(Vgghistory.history['loss'])
plt.plot(Vgghistory.history['val_loss'])
plt.legend(['loss', 'val_loss'])
plt.show()
# expand the SMOTE-resampled grayscale images to 3 channels as well
trainX_smote = np.repeat(trainX_smote, 3, -1)
testX_smote = np.repeat(testX_smote, 3, -1)
# note: this continues training the already-fitted vgg_model and overwrites Vgghistory
Vgghistory = vgg_model.fit(trainX_smote, trainY_smote, batch_size=100, epochs=50, validation_data=(testX_smote, testY_smote))
Epoch 1/50 250/250 [==============================] - 5s 18ms/step - loss: 0.4980 - accuracy: 0.7611 - val_loss: 0.5017 - val_accuracy: 0.7620 Epoch 2/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4958 - accuracy: 0.7632 - val_loss: 0.5035 - val_accuracy: 0.7612 Epoch 3/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4948 - accuracy: 0.7638 - val_loss: 0.4993 - val_accuracy: 0.7643 Epoch 4/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4939 - accuracy: 0.7642 - val_loss: 0.4979 - val_accuracy: 0.7653 Epoch 5/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4932 - accuracy: 0.7636 - val_loss: 0.4975 - val_accuracy: 0.7657 Epoch 6/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4922 - accuracy: 0.7644 - val_loss: 0.4973 - val_accuracy: 0.7648 Epoch 7/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4915 - accuracy: 0.7650 - val_loss: 0.4974 - val_accuracy: 0.7648 Epoch 8/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4913 - accuracy: 0.7656 - val_loss: 0.4965 - val_accuracy: 0.7664 Epoch 9/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4909 - accuracy: 0.7664 - val_loss: 0.4964 - val_accuracy: 0.7657 Epoch 10/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4906 - accuracy: 0.7666 - val_loss: 0.4964 - val_accuracy: 0.7661 Epoch 11/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4901 - accuracy: 0.7658 - val_loss: 0.4950 - val_accuracy: 0.7672 Epoch 12/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4900 - accuracy: 0.7667 - val_loss: 0.4960 - val_accuracy: 0.7649 Epoch 13/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4895 - accuracy: 0.7682 - val_loss: 0.4944 - val_accuracy: 0.7667 Epoch 14/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4892 - accuracy: 0.7677 - val_loss: 0.4945 - val_accuracy: 
0.7672 Epoch 15/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4889 - accuracy: 0.7706 - val_loss: 0.4962 - val_accuracy: 0.7636 Epoch 16/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4886 - accuracy: 0.7679 - val_loss: 0.4950 - val_accuracy: 0.7683 Epoch 17/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4882 - accuracy: 0.7676 - val_loss: 0.4945 - val_accuracy: 0.7690 Epoch 18/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4881 - accuracy: 0.7683 - val_loss: 0.4934 - val_accuracy: 0.7677 Epoch 19/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4881 - accuracy: 0.7687 - val_loss: 0.4932 - val_accuracy: 0.7684 Epoch 20/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4883 - accuracy: 0.7681 - val_loss: 0.4951 - val_accuracy: 0.7653 Epoch 21/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4875 - accuracy: 0.7669 - val_loss: 0.4928 - val_accuracy: 0.7683 Epoch 22/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4873 - accuracy: 0.7687 - val_loss: 0.4934 - val_accuracy: 0.7702 Epoch 23/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4876 - accuracy: 0.7688 - val_loss: 0.4932 - val_accuracy: 0.7699 Epoch 24/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4872 - accuracy: 0.7676 - val_loss: 0.4955 - val_accuracy: 0.7636 Epoch 25/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4869 - accuracy: 0.7693 - val_loss: 0.4942 - val_accuracy: 0.7656 Epoch 26/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4868 - accuracy: 0.7688 - val_loss: 0.4926 - val_accuracy: 0.7677 Epoch 27/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4866 - accuracy: 0.7686 - val_loss: 0.4942 - val_accuracy: 0.7676 Epoch 28/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4863 - accuracy: 0.7695 - val_loss: 0.4922 
- val_accuracy: 0.7690 Epoch 29/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4863 - accuracy: 0.7695 - val_loss: 0.4922 - val_accuracy: 0.7684 Epoch 30/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4862 - accuracy: 0.7684 - val_loss: 0.4916 - val_accuracy: 0.7687 Epoch 31/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4859 - accuracy: 0.7699 - val_loss: 0.4920 - val_accuracy: 0.7680 Epoch 32/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4855 - accuracy: 0.7706 - val_loss: 0.4917 - val_accuracy: 0.7700 Epoch 33/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4856 - accuracy: 0.7699 - val_loss: 0.4932 - val_accuracy: 0.7654 Epoch 34/50 250/250 [==============================] - 4s 16ms/step - loss: 0.4854 - accuracy: 0.7692 - val_loss: 0.4943 - val_accuracy: 0.7687 Epoch 35/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4855 - accuracy: 0.7710 - val_loss: 0.4912 - val_accuracy: 0.7685 Epoch 36/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4851 - accuracy: 0.7705 - val_loss: 0.4958 - val_accuracy: 0.7681 Epoch 37/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4852 - accuracy: 0.7706 - val_loss: 0.4960 - val_accuracy: 0.7627 Epoch 38/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4852 - accuracy: 0.7698 - val_loss: 0.4909 - val_accuracy: 0.7694 Epoch 39/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4849 - accuracy: 0.7701 - val_loss: 0.4906 - val_accuracy: 0.7699 Epoch 40/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4848 - accuracy: 0.7703 - val_loss: 0.4906 - val_accuracy: 0.7695 Epoch 41/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4848 - accuracy: 0.7703 - val_loss: 0.4924 - val_accuracy: 0.7652 Epoch 42/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4843 - accuracy: 0.7706 - 
val_loss: 0.4905 - val_accuracy: 0.7695 Epoch 43/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4842 - accuracy: 0.7704 - val_loss: 0.4902 - val_accuracy: 0.7688 Epoch 44/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4842 - accuracy: 0.7710 - val_loss: 0.4903 - val_accuracy: 0.7692 Epoch 45/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4841 - accuracy: 0.7701 - val_loss: 0.4906 - val_accuracy: 0.7682 Epoch 46/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4840 - accuracy: 0.7709 - val_loss: 0.4905 - val_accuracy: 0.7682 Epoch 47/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4838 - accuracy: 0.7714 - val_loss: 0.4900 - val_accuracy: 0.7697 Epoch 48/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4835 - accuracy: 0.7699 - val_loss: 0.4926 - val_accuracy: 0.7649 Epoch 49/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4835 - accuracy: 0.7705 - val_loss: 0.4909 - val_accuracy: 0.7711 Epoch 50/50 250/250 [==============================] - 4s 15ms/step - loss: 0.4833 - accuracy: 0.7711 - val_loss: 0.4897 - val_accuracy: 0.7692
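The SMOTE run above plateaus around val_loss ≈ 0.49 well before epoch 50, so the `EarlyStopping` callback imported at the top (but never used) would have stopped training much earlier. Its core patience logic amounts to the following sketch (plain Python, for illustration only):

```python
def early_stop_epoch(val_losses, patience=5):
    """Return the epoch index at which patience-based early stopping fires,
    or None if the monitored loss keeps improving."""
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:      # new best value: reset the patience counter
            best = loss
            wait = 0
        else:                # no improvement this epoch
            wait += 1
            if wait >= patience:
                return epoch
    return None

# a curve that improves for 3 epochs, then stalls for 5
print(early_stop_epoch([0.50, 0.48, 0.47, 0.49, 0.50, 0.49, 0.50, 0.51], patience=5))  # 7
```

In Keras this is `vgg_model.fit(..., callbacks=[EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)])`.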
from tensorflow.keras.applications import VGG19, ResNet50, MobileNet
# load MobileNet with ImageNet weights, dropping its classifier head
Mobilenet = MobileNet(input_shape=IMAGE_SIZE + [3], weights='imagenet', include_top=False)
WARNING:tensorflow:`input_shape` is undefined or non-square, or `rows` is not in [128, 160, 192, 224]. Weights for input shape (224, 224) will be loaded as the default. Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/mobilenet/mobilenet_1_0_224_tf_no_top.h5 17227776/17225924 [==============================] - 0s 0us/step
# freeze the pretrained weights so only the new head is trained
for layer in Mobilenet.layers:
    layer.trainable = False
# new classification head on top of the MobileNet features
x2 = Flatten()(Mobilenet.output)
prediction2 = Dense(1, activation='sigmoid')(x2)
# create a model object
Mobilenet_model = Model(inputs=Mobilenet.input, outputs=prediction2)
# view the structure of the model
Mobilenet_model.summary()
Model: "model_1" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= input_2 (InputLayer) [(None, 32, 32, 3)] 0 _________________________________________________________________ conv1 (Conv2D) (None, 16, 16, 32) 864 _________________________________________________________________ conv1_bn (BatchNormalization (None, 16, 16, 32) 128 _________________________________________________________________ conv1_relu (ReLU) (None, 16, 16, 32) 0 _________________________________________________________________ conv_dw_1 (DepthwiseConv2D) (None, 16, 16, 32) 288 _________________________________________________________________ conv_dw_1_bn (BatchNormaliza (None, 16, 16, 32) 128 _________________________________________________________________ conv_dw_1_relu (ReLU) (None, 16, 16, 32) 0 _________________________________________________________________ conv_pw_1 (Conv2D) (None, 16, 16, 64) 2048 _________________________________________________________________ conv_pw_1_bn (BatchNormaliza (None, 16, 16, 64) 256 _________________________________________________________________ conv_pw_1_relu (ReLU) (None, 16, 16, 64) 0 _________________________________________________________________ conv_pad_2 (ZeroPadding2D) (None, 17, 17, 64) 0 _________________________________________________________________ conv_dw_2 (DepthwiseConv2D) (None, 8, 8, 64) 576 _________________________________________________________________ conv_dw_2_bn (BatchNormaliza (None, 8, 8, 64) 256 _________________________________________________________________ conv_dw_2_relu (ReLU) (None, 8, 8, 64) 0 _________________________________________________________________ conv_pw_2 (Conv2D) (None, 8, 8, 128) 8192 _________________________________________________________________ conv_pw_2_bn (BatchNormaliza (None, 8, 8, 128) 512 _________________________________________________________________ conv_pw_2_relu (ReLU) 
(None, 8, 8, 128) 0 _________________________________________________________________ conv_dw_3 (DepthwiseConv2D) (None, 8, 8, 128) 1152 _________________________________________________________________ conv_dw_3_bn (BatchNormaliza (None, 8, 8, 128) 512 _________________________________________________________________ conv_dw_3_relu (ReLU) (None, 8, 8, 128) 0 _________________________________________________________________ conv_pw_3 (Conv2D) (None, 8, 8, 128) 16384 _________________________________________________________________ conv_pw_3_bn (BatchNormaliza (None, 8, 8, 128) 512 _________________________________________________________________ conv_pw_3_relu (ReLU) (None, 8, 8, 128) 0 _________________________________________________________________ conv_pad_4 (ZeroPadding2D) (None, 9, 9, 128) 0 _________________________________________________________________ conv_dw_4 (DepthwiseConv2D) (None, 4, 4, 128) 1152 _________________________________________________________________ conv_dw_4_bn (BatchNormaliza (None, 4, 4, 128) 512 _________________________________________________________________ conv_dw_4_relu (ReLU) (None, 4, 4, 128) 0 _________________________________________________________________ conv_pw_4 (Conv2D) (None, 4, 4, 256) 32768 _________________________________________________________________ conv_pw_4_bn (BatchNormaliza (None, 4, 4, 256) 1024 _________________________________________________________________ conv_pw_4_relu (ReLU) (None, 4, 4, 256) 0 _________________________________________________________________ conv_dw_5 (DepthwiseConv2D) (None, 4, 4, 256) 2304 _________________________________________________________________ conv_dw_5_bn (BatchNormaliza (None, 4, 4, 256) 1024 _________________________________________________________________ conv_dw_5_relu (ReLU) (None, 4, 4, 256) 0 _________________________________________________________________ conv_pw_5 (Conv2D) (None, 4, 4, 256) 65536 
_________________________________________________________________ conv_pw_5_bn (BatchNormaliza (None, 4, 4, 256) 1024 _________________________________________________________________ conv_pw_5_relu (ReLU) (None, 4, 4, 256) 0 _________________________________________________________________ conv_pad_6 (ZeroPadding2D) (None, 5, 5, 256) 0 _________________________________________________________________ conv_dw_6 (DepthwiseConv2D) (None, 2, 2, 256) 2304 _________________________________________________________________ conv_dw_6_bn (BatchNormaliza (None, 2, 2, 256) 1024 _________________________________________________________________ conv_dw_6_relu (ReLU) (None, 2, 2, 256) 0 _________________________________________________________________ conv_pw_6 (Conv2D) (None, 2, 2, 512) 131072 _________________________________________________________________ conv_pw_6_bn (BatchNormaliza (None, 2, 2, 512) 2048 _________________________________________________________________ conv_pw_6_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_dw_7 (DepthwiseConv2D) (None, 2, 2, 512) 4608 _________________________________________________________________ conv_dw_7_bn (BatchNormaliza (None, 2, 2, 512) 2048 _________________________________________________________________ conv_dw_7_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_pw_7 (Conv2D) (None, 2, 2, 512) 262144 _________________________________________________________________ conv_pw_7_bn (BatchNormaliza (None, 2, 2, 512) 2048 _________________________________________________________________ conv_pw_7_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_dw_8 (DepthwiseConv2D) (None, 2, 2, 512) 4608 _________________________________________________________________ conv_dw_8_bn (BatchNormaliza (None, 2, 2, 512) 2048 _________________________________________________________________ 
conv_dw_8_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_pw_8 (Conv2D) (None, 2, 2, 512) 262144 _________________________________________________________________ conv_pw_8_bn (BatchNormaliza (None, 2, 2, 512) 2048 _________________________________________________________________ conv_pw_8_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_dw_9 (DepthwiseConv2D) (None, 2, 2, 512) 4608 _________________________________________________________________ conv_dw_9_bn (BatchNormaliza (None, 2, 2, 512) 2048 _________________________________________________________________ conv_dw_9_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_pw_9 (Conv2D) (None, 2, 2, 512) 262144 _________________________________________________________________ conv_pw_9_bn (BatchNormaliza (None, 2, 2, 512) 2048 _________________________________________________________________ conv_pw_9_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_dw_10 (DepthwiseConv2D) (None, 2, 2, 512) 4608 _________________________________________________________________ conv_dw_10_bn (BatchNormaliz (None, 2, 2, 512) 2048 _________________________________________________________________ conv_dw_10_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_pw_10 (Conv2D) (None, 2, 2, 512) 262144 _________________________________________________________________ conv_pw_10_bn (BatchNormaliz (None, 2, 2, 512) 2048 _________________________________________________________________ conv_pw_10_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_dw_11 (DepthwiseConv2D) (None, 2, 2, 512) 4608 _________________________________________________________________ conv_dw_11_bn (BatchNormaliz (None, 2, 2, 512) 2048 
_________________________________________________________________ conv_dw_11_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_pw_11 (Conv2D) (None, 2, 2, 512) 262144 _________________________________________________________________ conv_pw_11_bn (BatchNormaliz (None, 2, 2, 512) 2048 _________________________________________________________________ conv_pw_11_relu (ReLU) (None, 2, 2, 512) 0 _________________________________________________________________ conv_pad_12 (ZeroPadding2D) (None, 3, 3, 512) 0 _________________________________________________________________ conv_dw_12 (DepthwiseConv2D) (None, 1, 1, 512) 4608 _________________________________________________________________ conv_dw_12_bn (BatchNormaliz (None, 1, 1, 512) 2048 _________________________________________________________________ conv_dw_12_relu (ReLU) (None, 1, 1, 512) 0 _________________________________________________________________ conv_pw_12 (Conv2D) (None, 1, 1, 1024) 524288 _________________________________________________________________ conv_pw_12_bn (BatchNormaliz (None, 1, 1, 1024) 4096 _________________________________________________________________ conv_pw_12_relu (ReLU) (None, 1, 1, 1024) 0 _________________________________________________________________ conv_dw_13 (DepthwiseConv2D) (None, 1, 1, 1024) 9216 _________________________________________________________________ conv_dw_13_bn (BatchNormaliz (None, 1, 1, 1024) 4096 _________________________________________________________________ conv_dw_13_relu (ReLU) (None, 1, 1, 1024) 0 _________________________________________________________________ conv_pw_13 (Conv2D) (None, 1, 1, 1024) 1048576 _________________________________________________________________ conv_pw_13_bn (BatchNormaliz (None, 1, 1, 1024) 4096 _________________________________________________________________ conv_pw_13_relu (ReLU) (None, 1, 1, 1024) 0 
_________________________________________________________________ flatten_4 (Flatten) (None, 1024) 0 _________________________________________________________________ dense_4 (Dense) (None, 1) 1025 ================================================================= Total params: 3,229,889 Trainable params: 1,025 Non-trainable params: 3,228,864 _________________________________________________________________
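With the 32x32 inputs used here, MobileNet's final feature map is 1x1x1024, so Flatten and global average pooling produce the same 1,024 features and the head stays at 1,025 parameters. At larger input sizes the two diverge: GAP keeps the head size fixed while Flatten grows it with the spatial area. A numpy sketch of that difference (the 7x7 map is illustrative, e.g. what a 224x224 input would give):

```python
import numpy as np

# hypothetical MobileNet feature map for a larger input: batch of 1, 7x7 spatial, 1024 channels
feat = np.random.rand(1, 7, 7, 1024).astype('float32')

flat = feat.reshape(1, -1)       # Flatten: 7 * 7 * 1024 = 50176 features
gap = feat.mean(axis=(1, 2))     # GlobalAveragePooling2D: 1024 features

print(flat.shape, gap.shape)  # (1, 50176) (1, 1024)
```

Swapping `Flatten()` for `tensorflow.keras.layers.GlobalAveragePooling2D()` is a common choice when fine-tuning at higher resolutions.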
# Compile the model
Mobilenet_model.compile(
    loss=binary_crossentropy,
    optimizer='adam',
    metrics=['accuracy']
)
Mobilenet_history = Mobilenet_model.fit(trainX, trainY, batch_size=100, epochs=40, validation_data=(testX, testY))
Epoch 1/40  205/205 [==============================] - 5s 14ms/step - loss: 0.6380 - accuracy: 0.6647 - val_loss: 0.6137 - val_accuracy: 0.6810
Epoch 2/40  205/205 [==============================] - 2s 11ms/step - loss: 0.6071 - accuracy: 0.6844 - val_loss: 0.6016 - val_accuracy: 0.6881
Epoch 3/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5987 - accuracy: 0.6895 - val_loss: 0.5977 - val_accuracy: 0.6895
Epoch 4/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5957 - accuracy: 0.6894 - val_loss: 0.5962 - val_accuracy: 0.6908
Epoch 5/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5948 - accuracy: 0.6897 - val_loss: 0.5958 - val_accuracy: 0.6916
Epoch 6/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5945 - accuracy: 0.6901 - val_loss: 0.5956 - val_accuracy: 0.6909
Epoch 7/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5941 - accuracy: 0.6905 - val_loss: 0.5956 - val_accuracy: 0.6936
Epoch 8/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5940 - accuracy: 0.6909 - val_loss: 0.5954 - val_accuracy: 0.6922
Epoch 9/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5938 - accuracy: 0.6909 - val_loss: 0.5953 - val_accuracy: 0.6910
Epoch 10/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5939 - accuracy: 0.6905 - val_loss: 0.5952 - val_accuracy: 0.6925
Epoch 11/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5937 - accuracy: 0.6913 - val_loss: 0.5951 - val_accuracy: 0.6922
Epoch 12/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5936 - accuracy: 0.6920 - val_loss: 0.5950 - val_accuracy: 0.6922
Epoch 13/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5936 - accuracy: 0.6911 - val_loss: 0.5950 - val_accuracy: 0.6945
Epoch 14/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5935 - accuracy: 0.6909 - val_loss: 0.5950 - val_accuracy: 0.6940
Epoch 15/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5933 - accuracy: 0.6917 - val_loss: 0.5952 - val_accuracy: 0.6920
Epoch 16/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5934 - accuracy: 0.6911 - val_loss: 0.5949 - val_accuracy: 0.6932
Epoch 17/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5933 - accuracy: 0.6906 - val_loss: 0.5948 - val_accuracy: 0.6940
Epoch 18/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5933 - accuracy: 0.6912 - val_loss: 0.5949 - val_accuracy: 0.6916
Epoch 19/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5933 - accuracy: 0.6914 - val_loss: 0.5947 - val_accuracy: 0.6928
Epoch 20/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5932 - accuracy: 0.6905 - val_loss: 0.5947 - val_accuracy: 0.6936
Epoch 21/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5931 - accuracy: 0.6911 - val_loss: 0.5947 - val_accuracy: 0.6922
Epoch 22/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5931 - accuracy: 0.6917 - val_loss: 0.5947 - val_accuracy: 0.6932
Epoch 23/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5932 - accuracy: 0.6918 - val_loss: 0.5946 - val_accuracy: 0.6937
Epoch 24/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5931 - accuracy: 0.6916 - val_loss: 0.5946 - val_accuracy: 0.6936
Epoch 25/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5930 - accuracy: 0.6917 - val_loss: 0.5946 - val_accuracy: 0.6921
Epoch 26/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5930 - accuracy: 0.6918 - val_loss: 0.5946 - val_accuracy: 0.6938
Epoch 27/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5929 - accuracy: 0.6921 - val_loss: 0.5945 - val_accuracy: 0.6936
Epoch 28/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5929 - accuracy: 0.6912 - val_loss: 0.5947 - val_accuracy: 0.6934
Epoch 29/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5930 - accuracy: 0.6919 - val_loss: 0.5946 - val_accuracy: 0.6940
Epoch 30/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5929 - accuracy: 0.6913 - val_loss: 0.5947 - val_accuracy: 0.6941
Epoch 31/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5928 - accuracy: 0.6914 - val_loss: 0.5945 - val_accuracy: 0.6943
Epoch 32/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5929 - accuracy: 0.6917 - val_loss: 0.5945 - val_accuracy: 0.6938
Epoch 33/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5928 - accuracy: 0.6919 - val_loss: 0.5945 - val_accuracy: 0.6940
Epoch 34/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5928 - accuracy: 0.6920 - val_loss: 0.5946 - val_accuracy: 0.6943
Epoch 35/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5927 - accuracy: 0.6922 - val_loss: 0.5944 - val_accuracy: 0.6940
Epoch 36/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5928 - accuracy: 0.6933 - val_loss: 0.5945 - val_accuracy: 0.6942
Epoch 37/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5928 - accuracy: 0.6923 - val_loss: 0.5944 - val_accuracy: 0.6943
Epoch 38/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5927 - accuracy: 0.6918 - val_loss: 0.5944 - val_accuracy: 0.6944
Epoch 39/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5928 - accuracy: 0.6926 - val_loss: 0.5944 - val_accuracy: 0.6936
Epoch 40/40  205/205 [==============================] - 2s 11ms/step - loss: 0.5927 - accuracy: 0.6917 - val_loss: 0.5944 - val_accuracy: 0.6946
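In the log above, validation loss plateaus near 0.594 from roughly epoch 20 onward, so most of the remaining epochs add little. The `EarlyStopping` callback imported at the top of the notebook can cut training off once `val_loss` stops improving. A minimal, self-contained sketch on a synthetic stand-in model (the tiny `Dense` network and random data below are illustrative placeholders, not the project's MobileNet or CXR arrays):

```python
# Illustrative sketch: early stopping on a small stand-in model.
# The model and data are synthetic placeholders for demonstration only.
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import EarlyStopping

rng = np.random.default_rng(0)
X = rng.normal(size=(512, 16)).astype("float32")
y = rng.integers(0, 2, size=512).astype("float32")

model = Sequential([
    Input(shape=(16,)),
    Dense(8, activation="relu"),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stop once val_loss has not improved for 5 consecutive epochs,
# and roll the model back to the best weights seen so far.
early_stop = EarlyStopping(monitor="val_loss", patience=5,
                           restore_best_weights=True, verbose=1)

history = model.fit(X, y, validation_split=0.2, epochs=40,
                    batch_size=64, callbacks=[early_stop], verbose=0)
print("epochs actually run:", len(history.history["loss"]))
```

Passing the same callback in the `callbacks=` list of the MobileNet `fit` calls would apply the identical behaviour to the real training runs.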
# Train on the SMOTE-resampled training set; note that fit() continues from the
# model's current weights, so rebuild/recompile Mobilenet_model first for a fresh run
Mobilenet_smote_history = Mobilenet_model.fit(trainX_smote, trainY_smote, batch_size=100, epochs=40, validation_data=(testX_smote, testY_smote))
Epoch 1/40  250/250 [==============================] - 5s 14ms/step - loss: 0.6521 - accuracy: 0.6277 - val_loss: 0.6528 - val_accuracy: 0.6248
Epoch 2/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6495 - accuracy: 0.6297 - val_loss: 0.6523 - val_accuracy: 0.6259
Epoch 3/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6492 - accuracy: 0.6303 - val_loss: 0.6524 - val_accuracy: 0.6254
Epoch 4/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6491 - accuracy: 0.6304 - val_loss: 0.6519 - val_accuracy: 0.6278
Epoch 5/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6490 - accuracy: 0.6302 - val_loss: 0.6519 - val_accuracy: 0.6263
Epoch 6/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6489 - accuracy: 0.6307 - val_loss: 0.6517 - val_accuracy: 0.6271
Epoch 7/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6488 - accuracy: 0.6311 - val_loss: 0.6518 - val_accuracy: 0.6265
Epoch 8/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6488 - accuracy: 0.6303 - val_loss: 0.6517 - val_accuracy: 0.6275
Epoch 9/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6488 - accuracy: 0.6308 - val_loss: 0.6517 - val_accuracy: 0.6271
Epoch 10/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6487 - accuracy: 0.6314 - val_loss: 0.6516 - val_accuracy: 0.6284
Epoch 11/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6487 - accuracy: 0.6308 - val_loss: 0.6516 - val_accuracy: 0.6253
Epoch 12/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6487 - accuracy: 0.6311 - val_loss: 0.6515 - val_accuracy: 0.6266
Epoch 13/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6487 - accuracy: 0.6307 - val_loss: 0.6515 - val_accuracy: 0.6283
Epoch 14/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6487 - accuracy: 0.6311 - val_loss: 0.6516 - val_accuracy: 0.6277
Epoch 15/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6486 - accuracy: 0.6313 - val_loss: 0.6515 - val_accuracy: 0.6284
Epoch 16/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6487 - accuracy: 0.6315 - val_loss: 0.6515 - val_accuracy: 0.6294
Epoch 17/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6327 - val_loss: 0.6514 - val_accuracy: 0.6286
Epoch 18/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6484 - accuracy: 0.6320 - val_loss: 0.6516 - val_accuracy: 0.6265
Epoch 19/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6486 - accuracy: 0.6321 - val_loss: 0.6514 - val_accuracy: 0.6275
Epoch 20/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6321 - val_loss: 0.6514 - val_accuracy: 0.6289
Epoch 21/40  250/250 [==============================] - 3s 12ms/step - loss: 0.6484 - accuracy: 0.6311 - val_loss: 0.6519 - val_accuracy: 0.6263
Epoch 22/40  250/250 [==============================] - 3s 12ms/step - loss: 0.6485 - accuracy: 0.6319 - val_loss: 0.6514 - val_accuracy: 0.6279
Epoch 23/40  250/250 [==============================] - 3s 12ms/step - loss: 0.6485 - accuracy: 0.6310 - val_loss: 0.6516 - val_accuracy: 0.6281
Epoch 24/40  250/250 [==============================] - 3s 12ms/step - loss: 0.6486 - accuracy: 0.6321 - val_loss: 0.6514 - val_accuracy: 0.6298
Epoch 25/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6486 - accuracy: 0.6309 - val_loss: 0.6514 - val_accuracy: 0.6283
Epoch 26/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6316 - val_loss: 0.6514 - val_accuracy: 0.6275
Epoch 27/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6308 - val_loss: 0.6514 - val_accuracy: 0.6279
Epoch 28/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6308 - val_loss: 0.6513 - val_accuracy: 0.6292
Epoch 29/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6313 - val_loss: 0.6513 - val_accuracy: 0.6290
Epoch 30/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6319 - val_loss: 0.6513 - val_accuracy: 0.6292
Epoch 31/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6313 - val_loss: 0.6513 - val_accuracy: 0.6299
Epoch 32/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6317 - val_loss: 0.6513 - val_accuracy: 0.6291
Epoch 33/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6317 - val_loss: 0.6515 - val_accuracy: 0.6275
Epoch 34/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6323 - val_loss: 0.6514 - val_accuracy: 0.6284
Epoch 35/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6313 - val_loss: 0.6513 - val_accuracy: 0.6288
Epoch 36/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6484 - accuracy: 0.6322 - val_loss: 0.6513 - val_accuracy: 0.6297
Epoch 37/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6319 - val_loss: 0.6513 - val_accuracy: 0.6285
Epoch 38/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6316 - val_loss: 0.6514 - val_accuracy: 0.6277
Epoch 39/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6317 - val_loss: 0.6513 - val_accuracy: 0.6294
Epoch 40/40  250/250 [==============================] - 3s 11ms/step - loss: 0.6485 - accuracy: 0.6318 - val_loss: 0.6514 - val_accuracy: 0.6280
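Comparing the two runs, the SMOTE-resampled training ends around 0.628 validation accuracy versus roughly 0.695 on the original data, and both curves are nearly flat. The recorded `History` objects make this easy to visualize with the matplotlib imports already in the notebook. A self-contained sketch (the short lists below are illustrative stand-ins shaped like `history.history`; in the notebook you would pass `Mobilenet_smote_history.history` and the first run's history object instead):

```python
# Illustrative sketch: plot validation accuracy for two training runs.
# The dicts below are stand-ins mimicking Keras history.history contents.
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs outside a notebook
import matplotlib.pyplot as plt

# Stand-in values sampled from the logs above (epochs 1, 10, 20, 40).
base_history = {"val_accuracy": [0.6810, 0.6925, 0.6936, 0.6946]}
smote_history = {"val_accuracy": [0.6248, 0.6284, 0.6289, 0.6280]}

fig, ax = plt.subplots(figsize=(6, 4))
ax.plot(base_history["val_accuracy"], marker="o", label="original data")
ax.plot(smote_history["val_accuracy"], marker="o", label="SMOTE-resampled data")
ax.set_xlabel("checkpoint")
ax.set_ylabel("val_accuracy")
ax.set_title("MobileNet validation accuracy: original vs SMOTE")
ax.legend()
fig.savefig("val_accuracy_comparison.png")
```

In the notebook itself, `%matplotlib inline` renders the figure directly, so the `Agg` backend and `savefig` call are only needed when running as a script.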